ansible-playbook [core 2.17.13]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-gHt
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Jun 4 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-7)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Wednesday 30 July 2025 11:21:56 -0400 (0:00:00.435) 0:00:00.435 ********
[WARNING]: Platform linux on host managed-node2 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node2]
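The interpreter-discovery warning above goes away if the interpreter is pinned per host instead of discovered. A minimal sketch, assuming /usr/bin/python3.9 is the interpreter actually wanted on the managed node (the host_vars path is illustrative):

    # host_vars/managed-node2.yml (hypothetical file)
    ansible_python_interpreter: /usr/bin/python3.9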
TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Wednesday 30 July 2025 11:22:01 -0400 (0:00:05.303) 0:00:05.738 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Wednesday 30 July 2025 11:22:01 -0400 (0:00:00.193) 0:00:05.932 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Wednesday 30 July 2025 11:22:02 -0400 (0:00:00.177) 0:00:06.110 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Wednesday 30 July 2025 11:22:02 -0400 (0:00:00.250) 0:00:06.360 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Wednesday 30 July 2025 11:22:02 -0400 (0:00:00.230) 0:00:06.591 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Wednesday 30 July 2025 11:22:02 -0400 (0:00:00.167) 0:00:06.759 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Wednesday 30 July 2025 11:22:02 -0400 (0:00:00.181) 0:00:06.940 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Wednesday 30 July 2025 11:22:03 -0400 (0:00:00.220) 0:00:07.160 ********
included: fedora.linux_system_roles.storage for managed-node2
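Every FIPS task above is gated on the same environment lookup, so an unset SYSTEM_ROLES_TEST_FIPS skips the whole block. A minimal sketch of the gating pattern, with the module and package chosen for illustration (the test's actual task bodies are not shown in this log):

    - name: Ensure dracut-fips
      ansible.builtin.package:
        name: dracut-fips  # assumed package, matching the task name
        state: present
      when: lookup("env", "SYSTEM_ROLES_TEST_FIPS") == "true"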
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:22:03 -0400 (0:00:00.531) 0:00:07.692 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:22:03 -0400 (0:00:00.305) 0:00:07.997 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:22:04 -0400 (0:00:00.585) 0:00:08.582 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 30 July 2025 11:22:05 -0400 (0:00:00.715) 0:00:09.298 ********
ok: [managed-node2] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 30 July 2025 11:22:07 -0400 (0:00:02.350) 0:00:11.653 ********
ok: [managed-node2] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }
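The skipped RedHat.yml/CentOS.yml items and the applied CentOS_9.yml show the usual generic-to-specific vars cascade, where each candidate file is included only if it exists. A minimal sketch of that pattern, reconstructed from the false_condition in the log (the role's real candidate list may differ):

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ __vars_file }}"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      loop:
        - RedHat.yml
        - CentOS.yml
        - CentOS_9.yml
      when: __vars_file is file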
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 30 July 2025 11:22:07 -0400 (0:00:00.298) 0:00:11.951 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 30 July 2025 11:22:08 -0400 (0:00:00.160) 0:00:12.111 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 30 July 2025 11:22:08 -0400 (0:00:00.175) 0:00:12.287 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 30 July 2025 11:22:09 -0400 (0:00:00.761) 0:00:13.049 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 30 July 2025 11:22:13 -0400 (0:00:04.314) 0:00:17.363 ********
ok: [managed-node2] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 30 July 2025 11:22:13 -0400 (0:00:00.514) 0:00:17.878 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 30 July 2025 11:22:14 -0400 (0:00:00.462) 0:00:18.341 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 30 July 2025 11:22:18 -0400 (0:00:03.716) 0:00:22.058 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 30 July 2025 11:22:18 -0400 (0:00:00.349) 0:00:22.407 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
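COPR enablement is double-gated: the item list is empty, and the support-package install below additionally requires install_copr to be truthy. A minimal sketch of that guard, with the package name as an assumption (the log shows only the condition):

    - name: Make sure COPR support packages are present
      ansible.builtin.package:
        name: dnf-plugins-core  # assumed COPR support package
        state: present
      when: install_copr | d(false) | bool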
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 30 July 2025 11:22:18 -0400 (0:00:00.446) 0:00:22.854 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 30 July 2025 11:22:19 -0400 (0:00:00.410) 0:00:23.264 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 30 July 2025 11:22:19 -0400 (0:00:00.386) 0:00:23.651 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Wednesday 30 July 2025 11:22:22 -0400 (0:00:02.512) 0:00:26.163 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive",
"status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { 
"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": 
"systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:22:26 -0400 (0:00:04.267) 0:00:30.436 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:22:27 -0400 (0:00:00.780) 0:00:31.216 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:22:27 -0400 (0:00:00.192) 0:00:31.409 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:22:29 -0400 (0:00:01.755) 0:00:33.164 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:22:29 -0400 (0:00:00.471) 0:00:33.636 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92
Wednesday 30 July 2025 11:22:29 -0400 (0:00:00.471) 0:00:33.636 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753888872.600405, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "db08cd98a8de40ca88168d76d8ecef16051ddb48", "ctime": 1753888870.9213874, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753888870.9213874, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97
Wednesday 30 July 2025 11:22:30 -0400 (0:00:01.088) 0:00:34.724 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Wednesday 30 July 2025 11:22:30 -0400 (0:00:00.292) 0:00:35.017 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Wednesday 30 July 2025 11:22:31 -0400 (0:00:00.188) 0:00:35.205 ********
ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130
Wednesday 30 July 2025 11:22:31 -0400 (0:00:00.313) 0:00:35.518 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134
Wednesday 30 July 2025 11:22:31 -0400 (0:00:00.335) 0:00:35.854 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Wednesday 30 July 2025 11:22:32 -0400 (0:00:00.252) 0:00:36.107 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
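The next task only reloads systemd when the role actually changed mounts, which is why it is skipped on this no-op run. A minimal sketch of that guard (daemon_reload usage assumed from the task name):

    - name: Tell systemd to refresh its view of /etc/fstab
      ansible.builtin.systemd:
        daemon_reload: true
      when: blivet_output['mounts'] | length > 0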
0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:22:33 -0400 (0:00:00.509) 0:00:37.245 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:22:33 -0400 (0:00:00.676) 0:00:37.922 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:22:34 -0400 (0:00:00.501) 0:00:38.423 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:22:34 -0400 (0:00:00.557) 0:00:38.981 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753887505.1430779, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753191233.179, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1753190915.528, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1227553237", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:22:36 -0400 (0:00:01.242) 0:00:40.223 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:22:36 -0400 (0:00:00.200) 0:00:40.424 ******** ok: [managed-node2] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:76 Wednesday 30 July 2025 11:22:38 -0400 (0:00:02.017) 0:00:42.441 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node2 TASK [Ensure test packages] **************************************************** task path: 
TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Wednesday 30 July 2025 11:22:38 -0400 (0:00:00.545) 0:00:42.986 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Wednesday 30 July 2025 11:22:41 -0400 (0:00:02.305) 0:00:45.291 ********
ok: [managed-node2] => { "changed": false, "disks": [ "sda" ], "info": [
    "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG_SEC=\"512\"",
    "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
    "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
    "filename [xvda1] is a partition",
    "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] }

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Wednesday 30 July 2025 11:22:44 -0400 (0:00:03.220) 0:00:48.512 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Wednesday 30 July 2025 11:22:44 -0400 (0:00:00.233) 0:00:48.745 ********
ok: [managed-node2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Wednesday 30 July 2025 11:22:44 -0400 (0:00:00.267) 0:00:49.012 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Wednesday 30 July 2025 11:22:45 -0400 (0:00:00.531) 0:00:49.543 ********
ok: [managed-node2] => { "unused_disks": [ "sda" ] }
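The info lines show the selection logic: whole disks (TYPE="disk") with an empty FSTYPE and no partitions qualify, which leaves only sda. A rough shell-level equivalent of that discovery, for illustration only (the test actually uses its own find_unused_disk module, whose source is not in this log):

    - name: List whole disks with no filesystem
      ansible.builtin.shell: |
        set -o pipefail
        lsblk -b -l -o NAME,TYPE,FSTYPE --noheadings | awk '$2 == "disk" && $3 == "" { print $1 }'
      register: candidate_disks
      changed_when: false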
TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:85
Wednesday 30 July 2025 11:22:45 -0400 (0:00:00.298) 0:00:49.842 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Wednesday 30 July 2025 11:22:46 -0400 (0:00:00.528) 0:00:50.371 ********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Wednesday 30 July 2025 11:22:46 -0400 (0:00:00.627) 0:00:50.998 ********
included: fedora.linux_system_roles.storage for managed-node2

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:22:47 -0400 (0:00:00.456) 0:00:51.455 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:22:47 -0400 (0:00:00.467) 0:00:51.922 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:22:49 -0400 (0:00:01.550) 0:00:53.472 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
"ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:22:50 -0400 (0:00:00.792) 0:00:54.265 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:22:50 -0400 (0:00:00.424) 0:00:54.689 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:22:51 -0400 (0:00:00.381) 0:00:55.071 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:22:51 -0400 (0:00:00.289) 0:00:55.360 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:22:51 -0400 (0:00:00.249) 0:00:55.610 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:22:52 -0400 (0:00:00.754) 0:00:56.364 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:22:54 -0400 (0:00:02.403) 0:00:58.790 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:22:55 -0400 (0:00:00.576) 0:00:59.366 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task 
path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:22:55 -0400 (0:00:00.546) 0:00:59.913 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:22:58 -0400 (0:00:02.329) 0:01:02.242 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:22:58 -0400 (0:00:00.558) 0:01:02.801 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:22:59 -0400 (0:00:00.448) 0:01:03.250 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:22:59 -0400 (0:00:00.551) 0:01:03.801 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:23:00 -0400 (0:00:00.484) 0:01:04.285 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:23:02 -0400 (0:00:02.612) 0:01:06.897 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:23:05 -0400 (0:00:02.894) 0:01:09.791 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 30 July 2025 11:23:06 -0400 (0:00:00.757) 0:01:10.549 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Wednesday 30 July 2025 11:23:06 -0400 (0:00:00.219) 0:01:10.769 ********
fatal: [managed-node2]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG: encrypted volume 'foo' missing key/password

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Wednesday 30 July 2025 11:23:08 -0400 (0:00:02.093) 0:01:12.862 ********
fatal: [managed-node2]: FAILED! => {
    "changed": false
}

MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'foo' missing key/password", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True,
'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Wednesday 30 July 2025 11:23:09 -0400 (0:00:00.366) 0:01:13.229 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Wednesday 30 July 2025 11:23:09 -0400 (0:00:00.198) 0:01:13.427 ********
ok: [managed-node2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Wednesday 30 July 2025 11:23:09 -0400 (0:00:00.292) 0:01:13.720 ********
ok: [managed-node2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Wednesday 30 July 2025 11:23:10 -0400 (0:00:00.392) 0:01:14.112 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__storage_failed_exception is defined",
    "skip_reason": "Conditional result was False"
}
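The three checks above are the tail of the verify-role-failed.yml harness: the role must have failed, the failure must carry the blivet result fields and the expected message, and a raised exception is only inspected when __storage_failed_exception is defined. A block/rescue sketch of how such a harness can be built; ansible_failed_result is standard inside rescue, but __storage_failed_regex and the task wording are assumptions, not the file's verbatim contents:

# Hedged sketch of an expected-failure harness; not verify-role-failed.yml itself.
- name: Verify role raises correct error
  block:
    - name: Run the storage role with the invalid spec
      include_role:
        name: fedora.linux_system_roles.storage
    - name: Trip the rescue if the role unexpectedly succeeded
      fail:
        msg: Role was expected to fail but completed successfully
  rescue:
    - name: Check that we failed in the role
      assert:
        that:
          - ansible_failed_result.msg is search(__storage_failed_regex)
        fail_msg: Unexpected error message {{ ansible_failed_result.msg }}

TASK [Create an encrypted disk volume w/ default fs] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:100
Wednesday 30 July 2025 11:23:10 -0400 (0:00:00.259) 0:01:14.371 ********
included: fedora.linux_system_roles.storage for managed-node2

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:23:11 -0400 (0:00:00.773) 0:01:15.144 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:23:11 -0400 (0:00:00.413) 0:01:15.558 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:23:12 -0400 (0:00:00.552) 0:01:16.111 ********
skipping: [managed-node2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": {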
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:23:12 -0400 (0:00:00.792) 0:01:16.904 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:23:13 -0400 (0:00:00.335) 0:01:17.272 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:23:13 -0400 (0:00:00.255) 0:01:17.527 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:23:13 -0400 (0:00:00.253) 0:01:17.781 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:23:13 -0400 (0:00:00.236) 0:01:18.017 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:23:14 -0400 (0:00:00.698) 0:01:18.716 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing 
to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 30 July 2025 11:23:17 -0400 (0:00:02.422) 0:01:21.139 ********
ok: [managed-node2] => {
    "storage_pools | d([])": []
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 30 July 2025 11:23:17 -0400 (0:00:00.596) 0:01:21.736 ********
ok: [managed-node2] => {
    "storage_volumes | d([])": [
        {
            "disks": [
                "sda"
            ],
            "encryption": true,
            "encryption_password": "yabbadabbadoo",
            "mount_point": "/opt/test1",
            "name": "foo",
            "type": "disk"
        }
    ]
}
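Relative to the failing attempt, the only change in the spec echoed above is the added encryption_password, which is what satisfies safe mode. A sketch of this second invocation under the same assumptions as the earlier ones (illustrative include_role wrapper; fields and the throwaway passphrase taken from the log):

# Sketch of the passing invocation; identical to the failing one except
# that key material is now supplied.
- name: Create an encrypted disk volume w/ default fs
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: foo
        type: disk
        disks: "{{ unused_disks }}"
        mount_point: /opt/test1
        encryption: true
        encryption_password: yabbadabbadoo   # throwaway test passphrase from the log

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 30 July 2025 11:23:18 -0400 (0:00:00.511) 0:01:22.247 ********
ok: [managed-node2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 30 July 2025 11:23:20 -0400 (0:00:02.036) 0:01:24.284 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 30 July 2025 11:23:20 -0400 (0:00:00.528) 0:01:24.812 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 30 July 2025 11:23:21 -0400 (0:00:00.450) 0:01:25.263 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 30 July 2025 11:23:21 -0400 (0:00:00.363) 0:01:25.626 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 30 July 2025 11:23:21 -0400 (0:00:00.385) 0:01:26.012 ********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Wednesday 30 July 2025 11:23:24 -0400 (0:00:02.254) 0:01:28.266 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": {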
"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": 
"modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:23:27 -0400 (0:00:02.985) 0:01:31.254 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:23:27 -0400 (0:00:00.662) 0:01:31.916 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:23:28 -0400 (0:00:00.269) 0:01:32.185 ******** changed: [managed-node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:23:38 -0400 (0:00:10.591) 0:01:42.778 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:23:39 -0400 (0:00:00.493) 0:01:43.272 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753888872.600405, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "db08cd98a8de40ca88168d76d8ecef16051ddb48", "ctime": 1753888870.9213874, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753888870.9213874, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:23:40 -0400 (0:00:01.234) 0:01:44.506 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:23:43 -0400 (0:00:03.310) 0:01:47.817 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:23:43 -0400 (0:00:00.201) 0:01:48.018 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "mounted" } ], "packages": [ "xfsprogs", 
"cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:23:44 -0400 (0:00:00.927) 0:01:48.946 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:23:45 -0400 (0:00:00.310) 0:01:49.257 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:23:45 -0400 (0:00:00.324) 0:01:49.581 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its 
view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:23:46 -0400 (0:00:00.672) 0:01:50.254 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:23:51 -0400 (0:00:05.516) 0:01:55.771 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:23:55 -0400 (0:00:03.595) 0:01:59.367 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:23:56 -0400 (0:00:00.782) 0:02:00.149 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:23:57 -0400 (0:00:01.800) 0:02:01.949 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753887505.1430779, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753191233.179, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": 
false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1753190915.528, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1227553237", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:23:59 -0400 (0:00:01.363) 0:02:03.313 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:24:01 -0400 (0:00:01.728) 0:02:05.041 ******** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:112 Wednesday 30 July 2025 11:24:03 -0400 (0:00:02.063) 0:02:07.105 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:24:03 -0400 (0:00:00.737) 0:02:07.843 ******** skipping: [managed-node2] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:24:04 -0400 (0:00:00.607) 0:02:08.451 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, 
"thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:24:05 -0400 (0:00:00.738) 0:02:09.189 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "size": "10G", "type": "crypt", "uuid": "f127f603-0a76-4790-8c14-0e602745d8ba" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "453b1ac7-19a4-4c29-b50b-1994fb6b8ca9" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:24:07 -0400 (0:00:02.692) 0:02:11.881 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002903", "end": "2025-07-30 11:24:10.260357", "rc": 0, "start": "2025-07-30 11:24:10.257454" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 30 July 2025 11:24:10 -0400 (0:00:02.676) 0:02:14.558 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003053", "end": "2025-07-30 11:24:11.425036", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:24:11.421983" } STDOUT: luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 30 July 2025 11:24:11 -0400 (0:00:01.164) 0:02:15.723 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:24:12 -0400 (0:00:00.579) 0:02:16.302 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'})
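[Editor's note: the state verified above can be reconstructed as a role invocation from the volume parameters echoed in blivet_output. A minimal sketch, assuming a plain play targeting the test host; the per-volume keys (name, type, disks, size, fs_type, mount_point, encryption, encryption_password) are the documented storage role variables that appear verbatim in the log, while the play layout and the luks_password variable are illustrative only -- the actual test wires these up inside tests_luks.yml ("Run the role", line 72), which is not shown here:

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            size: 10737418240        # 10 GiB, as reported in blivet_output
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_password: "{{ luks_password }}"   # hypothetical variable; the test passes a no_log value

Applied to a blank /dev/sda, this would drive the same three actions logged earlier -- "create format" (luks) on /dev/sda, "create device" for the /dev/mapper/luks-* node, and "create format" (xfs) on the opened device -- followed by the /etc/fstab and /etc/crypttab entries that the tasks above and below verify.]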
TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:24:13 -0400 (0:00:00.857) 0:02:17.159 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:24:13 -0400 (0:00:00.665) 0:02:17.825 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:24:15 -0400 (0:00:01.919) 0:02:19.744 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:24:16 -0400 (0:00:00.417) 0:02:20.162 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:24:16 -0400 (0:00:00.650) 0:02:20.812 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current 
mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:24:17 -0400 (0:00:00.682) 0:02:21.495 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:24:17 -0400 (0:00:00.346) 0:02:21.841 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:24:18 -0400 (0:00:00.601) 0:02:22.443 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:24:19 -0400 (0:00:00.682) 0:02:23.126 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:24:19 -0400 (0:00:00.670) 0:02:23.796 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:24:19 -0400 (0:00:00.159) 0:02:23.956 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:24:20 -0400 (0:00:00.267) 0:02:24.223 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:24:20 -0400 (0:00:00.256) 0:02:24.480 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:24:20 -0400 (0:00:00.265) 0:02:24.746 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:24:21 -0400 (0:00:00.939) 0:02:25.685 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:24:22 -0400 (0:00:00.634) 0:02:26.320 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:24:22 -0400 (0:00:00.684) 0:02:27.005 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:24:23 -0400 (0:00:00.537) 0:02:27.542 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:24:24 -0400 (0:00:00.618) 0:02:28.161 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:24:24 -0400 (0:00:00.253) 0:02:28.414 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:24:25 -0400 (0:00:00.640) 0:02:29.054 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is 
present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:24:25 -0400 (0:00:00.616) 0:02:29.671 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889018.1369207, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889018.1369207, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 451, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753889018.1369207, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:24:26 -0400 (0:00:01.345) 0:02:31.017 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:24:27 -0400 (0:00:00.373) 0:02:31.390 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:24:27 -0400 (0:00:00.224) 0:02:31.615 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:24:27 -0400 (0:00:00.289) 0:02:31.904 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:24:28 -0400 (0:00:00.277) 0:02:32.181 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:24:28 -0400 (0:00:00.235) 0:02:32.416 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:24:28 -0400 (0:00:00.297) 0:02:32.714 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889018.3829231, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889018.3829231, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 976, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889018.3829231, "nlink": 1, "path": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:24:30 -0400 (0:00:01.329) 0:02:34.043 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:24:32 -0400 (0:00:02.481) 0:02:36.525 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007272", "end": "2025-07-30 11:24:33.522375", "rc": 0, "start": "2025-07-30 11:24:33.515103" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 673738 Threads: 2 Salt: 18 cb 92 fd 45 1d dc 5f 8f 02 d6 64 39 a9 3e 98 1c 04 1b ab d2 71 52 86 e1 55 b1 dd 92 57 a8 ef AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 129517 Salt: 34 4f 67 69 b6 ef 72 f0 7f cc 17 fd c2 55 9d 09 e1 f7 45 d5 81 80 49 4d 6c d7 d4 7c a4 c5 61 31 Digest: df 0c 35 c1 65 c9 47 57 de e6 3a b2 5b 8b e2 ea aa e3 45 94 b7 82 28 1f 17 6b 74 f9 03 ff a7 a9 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:24:33 -0400 (0:00:01.259) 0:02:37.784 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:24:34 -0400 (0:00:00.568) 0:02:38.353 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make 
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Wednesday 30 July 2025 11:24:34 -0400 (0:00:00.657) 0:02:39.011 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Wednesday 30 July 2025 11:24:35 -0400 (0:00:00.313) 0:02:39.325 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Wednesday 30 July 2025 11:24:36 -0400 (0:00:00.725) 0:02:40.050 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Wednesday 30 July 2025 11:24:36 -0400 (0:00:00.354) 0:02:40.404 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 30 July 2025 11:24:36 -0400 (0:00:00.369) 0:02:40.774 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 30 July 2025 11:24:37 -0400 (0:00:00.352) 0:02:41.126 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 30 July 2025 11:24:37 -0400 (0:00:00.618) 0:02:41.745 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 30 July 2025 11:24:38 -0400 (0:00:00.699) 0:02:42.444 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
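The crypttab checks above work against entries of the standard form "<name> <backing device> <key file>", where "-" as the key file means no key file (the passphrase is prompted for). A sketch of the kind of assertion involved, using the entry captured in _storage_test_crypttab_entries above (the split-based checks are illustrative, not the test's exact expressions):

    - name: Validate the crypttab entry fields (sketch)
      ansible.builtin.assert:
        that:
          - _storage_test_crypttab_entries | length == 1
          - _storage_test_crypttab_entries[0].split()[1] == "/dev/sda"
          - _storage_test_crypttab_entries[0].split()[2] == "-"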
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 30 July 2025 11:24:39 -0400 (0:00:00.703) 0:02:43.148 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 30 July 2025 11:24:39 -0400 (0:00:00.703) 0:02:43.851 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 30 July 2025 11:24:40 -0400 (0:00:00.605) 0:02:44.457 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Wednesday 30 July 2025 11:24:40 -0400 (0:00:00.296) 0:02:44.753 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Wednesday 30 July 2025 11:24:40 -0400 (0:00:00.261) 0:02:45.015 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Wednesday 30 July 2025 11:24:41 -0400 (0:00:00.239) 0:02:45.254 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set md version regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Wednesday 30 July 2025 11:24:41 -0400 (0:00:00.258) 0:02:45.513 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Wednesday 30 July 2025 11:24:41 -0400 (0:00:00.246) 0:02:45.773 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Wednesday 30 July 2025 11:24:41 -0400 (0:00:00.155) 0:02:45.929 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Wednesday 30 July 2025 11:24:42 -0400 (0:00:00.148) 0:02:46.078 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Wednesday 30 July 2025 11:24:42 -0400 (0:00:00.253) 0:02:46.331 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Wednesday 30 July 2025 11:24:42 -0400 (0:00:00.231) 0:02:46.562 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Wednesday 30 July 2025 11:24:42 -0400 (0:00:00.257) 0:02:46.820 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Wednesday 30 July 2025 11:24:42 -0400 (0:00:00.195) 0:02:47.016 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" }
TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Wednesday 30 July 2025 11:24:43 -0400 (0:00:00.411) 0:02:47.427 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Wednesday 30 July 2025 11:24:43 -0400 (0:00:00.538) 0:02:47.965 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }
TASK [Show expected size] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Wednesday 30 July 2025 11:24:44 -0400 (0:00:00.555) 0:02:48.521 ********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Wednesday 30 July 2025 11:24:44 -0400 (0:00:00.321) 0:02:48.842 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }
TASK [Show test pool] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Wednesday 30 July 2025 11:24:45 -0400 (0:00:00.705) 0:02:49.547 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }
TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Wednesday 30 July 2025 11:24:46 -0400 (0:00:00.573) 0:02:50.121 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }
TASK [Show test pool size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Wednesday 30 July 2025 11:24:46 -0400 (0:00:00.528) 0:02:50.650 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Wednesday 30 July 2025 11:24:47 -0400 (0:00:00.584) 0:02:51.234 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }
TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Wednesday 30 July 2025 11:24:47 -0400 (0:00:00.546) 0:02:51.781 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Wednesday 30 July 2025 11:24:48 -0400 (0:00:00.384) 0:02:52.165 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Wednesday 30 July 2025 11:24:48 -0400 (0:00:00.402) 0:02:52.568 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Wednesday 30 July 2025 11:24:48 -0400 (0:00:00.390) 0:02:52.959 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:24:49 -0400 (0:00:00.252) 0:02:53.212 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:24:49 -0400 (0:00:00.400) 0:02:53.613 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:24:49 -0400 (0:00:00.286) 0:02:53.899 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:24:50 -0400 (0:00:00.426) 0:02:54.325 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:24:50 -0400 (0:00:00.422) 0:02:54.747 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:24:51 -0400 (0:00:00.340) 0:02:55.087 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:24:51 -0400 (0:00:00.339) 0:02:55.427 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:24:51 -0400 (0:00:00.362) 0:02:55.790 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 
TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Wednesday 30 July 2025 11:24:52 -0400 (0:00:00.437) 0:02:56.227 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Wednesday 30 July 2025 11:24:52 -0400 (0:00:00.329) 0:02:56.556 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Wednesday 30 July 2025 11:24:52 -0400 (0:00:00.426) 0:02:56.983 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }
TASK [Show actual size] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Wednesday 30 July 2025 11:24:53 -0400 (0:00:00.419) 0:02:57.402 ********
ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } }
TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Wednesday 30 July 2025 11:24:53 -0400 (0:00:00.320) 0:02:57.723 ********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Wednesday 30 July 2025 11:24:53 -0400 (0:00:00.216) 0:02:57.940 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Wednesday 30 July 2025 11:24:54 -0400 (0:00:00.517) 0:02:58.457 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Wednesday 30 July 2025 11:24:54 -0400 (0:00:00.267) 0:02:58.724 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Check segment type] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Wednesday 30 July 2025 11:24:55 -0400 (0:00:00.393) 0:02:59.117 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Wednesday 30 July 2025 11:24:55 -0400 (0:00:00.226) 0:02:59.344 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Wednesday 30 July 2025 11:24:55 -0400 (0:00:00.204) 0:02:59.549 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Wednesday 30 July 2025 11:24:55 -0400 (0:00:00.288) 0:02:59.837 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Wednesday 30 July 2025 11:24:56 -0400 (0:00:00.269) 0:03:00.107 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Wednesday 30 July 2025 11:24:56 -0400 (0:00:00.292) 0:03:00.399 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Wednesday 30 July 2025 11:24:56 -0400 (0:00:00.387) 0:03:00.787 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Create a file] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Wednesday 30 July 2025 11:24:57 -0400 (0:00:00.295) 0:03:01.082 ********
changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
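The safe_mode test that follows invokes the storage role with storage_safe_mode enabled and a volume spec that would strip the LUKS layer from /dev/sda; since that would destroy existing data, the role is expected to fail rather than proceed. A sketch of the invocation under test, reconstructed from the variables shown below (parameter layout is illustrative):

    - name: Try to disable encryption in safe mode (sketch; expected to fail)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo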
TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:118
Wednesday 30 July 2025 11:25:00 -0400 (0:00:03.036) 0:03:04.119 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2
TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Wednesday 30 July 2025 11:25:00 -0400 (0:00:00.694) 0:03:04.814 ********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Wednesday 30 July 2025 11:25:01 -0400 (0:00:00.798) 0:03:05.612 ********
included: fedora.linux_system_roles.storage for managed-node2
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:25:02 -0400 (0:00:00.516) 0:03:06.128 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:25:02 -0400 (0:00:00.341) 0:03:06.470 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:25:03 -0400 (0:00:01.082) 0:03:07.552 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
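Note the architecture-conditional entry at the end of blivet_package_list above: the platform vars pick libblockdev-s390 on s390x hosts and plain libblockdev everywhere else. The relevant excerpt, as it would appear in vars/CentOS_9.yml (reconstructed from the output above; the list is abbreviated here):

    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      # ... remaining packages as listed above ...
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"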
"ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:25:04 -0400 (0:00:01.350) 0:03:08.903 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:25:05 -0400 (0:00:00.314) 0:03:09.218 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:25:05 -0400 (0:00:00.338) 0:03:09.557 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:25:05 -0400 (0:00:00.291) 0:03:09.848 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:25:06 -0400 (0:00:00.312) 0:03:10.161 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:25:07 -0400 (0:00:00.911) 0:03:11.072 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:25:09 -0400 (0:00:02.526) 0:03:13.599 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:25:10 -0400 (0:00:00.682) 0:03:14.281 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get 
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 30 July 2025 11:25:10 -0400 (0:00:00.660) 0:03:14.942 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 30 July 2025 11:25:13 -0400 (0:00:02.440) 0:03:17.383 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 30 July 2025 11:25:13 -0400 (0:00:00.617) 0:03:18.000 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 30 July 2025 11:25:14 -0400 (0:00:00.539) 0:03:18.540 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 30 July 2025 11:25:15 -0400 (0:00:00.524) 0:03:19.065 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 30 July 2025 11:25:15 -0400 (0:00:00.536) 0:03:19.601 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Wednesday 30 July 2025 11:25:18 -0400 (0:00:02.695) 0:03:22.297 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": 
"nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { 
"name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:25:22 -0400 (0:00:03.813) 0:03:26.111 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** 
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:25:23 -0400 (0:00:00.928) 0:03:27.039 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:25:23 -0400 (0:00:00.240) 0:03:27.280 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:25:25 -0400 (0:00:02.509) 0:03:29.790 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 
'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:25:26 -0400 (0:00:00.393) 0:03:30.184 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:25:26 -0400 (0:00:00.191) 0:03:30.395 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:25:26 -0400 (0:00:00.305) 0:03:30.700 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:25:27 -0400 (0:00:00.409) 0:03:31.110 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 30 July 2025 11:25:27 -0400 (0:00:00.349) 0:03:31.459 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889099.8777711, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889099.8777711, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753889099.8777711, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2687391841", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 30 July 2025 11:25:28 -0400 (0:00:01.199) 0:03:32.659 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Wednesday 30 July 2025 11:25:29 -0400 (0:00:00.434) 0:03:33.093 ******** 
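The failed attempt above ran the module with 'safe_mode': True, which refuses to destroy the existing LUKS formatting; the data-preservation checks then confirmed /opt/test1/quux survived. The re-run that follows presumably relaxes that guard. A minimal sketch of the implied invocation, assuming the role's documented storage_safe_mode variable and the storage_volumes values echoed later in this log:

    - name: Remove the encryption layer (sketch of the implied invocation)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false     # assumption: permits blivet to destroy the LUKS format
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false        # request removal of the encryption layer
            encryption_password: yabbadabbadoo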
included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:25:30 -0400 (0:00:00.942) 0:03:34.036 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:25:30 -0400 (0:00:00.477) 0:03:34.514 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:25:31 -0400 (0:00:00.619) 0:03:35.134 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:25:31 -0400 (0:00:00.834) 0:03:35.995 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:25:32 -0400 (0:00:00.349) 0:03:36.344 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:25:32 -0400 (0:00:00.380) 0:03:36.724 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:25:32 -0400 (0:00:00.302) 0:03:37.027 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:25:33 -0400 (0:00:00.339) 0:03:37.366 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:25:34 -0400 (0:00:00.778) 0:03:38.145 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:25:36 -0400 (0:00:02.762) 0:03:40.907 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:25:37 -0400 (0:00:00.607) 0:03:41.514 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:25:38 -0400 (0:00:00.661) 0:03:42.176 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:25:40 -0400 (0:00:02.387) 0:03:44.563 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 
July 2025 11:25:41 -0400 (0:00:00.586) 0:03:45.150 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:25:41 -0400 (0:00:00.566) 0:03:45.717 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:25:42 -0400 (0:00:00.535) 0:03:46.252 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:25:42 -0400 (0:00:00.472) 0:03:46.724 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:25:45 -0400 (0:00:02.558) 0:03:49.282 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": 
"cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" 
}, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:25:48 -0400 (0:00:02.883) 0:03:52.166 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:25:48 -0400 (0:00:00.698) 0:03:52.864 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:25:49 -0400 (0:00:00.218) 0:03:53.083 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], 
"changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:25:51 -0400 (0:00:02.843) 0:03:55.927 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:25:52 -0400 (0:00:00.396) 0:03:56.324 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889035.111097, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4282a5aaa913d10e99001fac14ba93bb40a9042b", "ctime": 1753889035.108097, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889035.108097, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:25:53 -0400 (0:00:00.999) 0:03:57.324 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:25:55 -0400 (0:00:02.007) 0:03:59.331 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:25:55 -0400 (0:00:00.281) 0:03:59.612 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:25:55 -0400 (0:00:00.372) 0:03:59.985 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the 
list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:25:56 -0400 (0:00:00.316) 0:04:00.322 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:25:56 -0400 (0:00:00.326) 0:04:00.649 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:25:58 -0400 (0:00:01.748) 0:04:02.397 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:26:00 -0400 (0:00:01.897) 0:04:04.295 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': 'UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", 
"fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:26:01 -0400 (0:00:01.331) 0:04:05.626 ******** skipping: [managed-node2] => (item={'src': 'UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:26:02 -0400 (0:00:00.728) 0:04:06.354 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:26:03 -0400 (0:00:01.515) 0:04:07.870 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889051.424267, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6371a127ea4a86df828d530ea7c5a0c8c82c3588", "ctime": 1753889040.747156, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 377487569, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753889040.7478693, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3103597905", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:26:04 -0400 (0:00:01.155) 0:04:09.026 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK 
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Wednesday 30 July 2025 11:26:06 -0400 (0:00:01.484) 0:04:10.510 ********
ok: [managed-node2]

TASK [Verify role results - 2] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:151
Wednesday 30 July 2025 11:26:08 -0400 (0:00:01.953) 0:04:12.464 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2

TASK [Print out pool information] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Wednesday 30 July 2025 11:26:09 -0400 (0:00:00.762) 0:04:13.226 ********
skipping: [managed-node2] => { "false_condition": "_storage_pools_list | length > 0" }

TASK [Print out volume information] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Wednesday 30 July 2025 11:26:09 -0400 (0:00:00.684) 0:04:13.911 ********
ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.]
***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:26:10 -0400 (0:00:00.756) 0:04:14.667 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ffce2b95-3173-4b3e-9fff-8f6521887e71" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:26:11 -0400 (0:00:01.193) 0:04:15.860 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003386", "end": "2025-07-30 11:26:12.865802", "rc": 0, "start": "2025-07-30 11:26:12.862416" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 30 July 2025 11:26:13 -0400 (0:00:01.238) 0:04:17.098 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002979", "end": "2025-07-30 11:26:14.144402", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:26:14.141423" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 30 July 2025 11:26:14 -0400 (0:00:01.272) 0:04:18.370 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Wednesday 30 July 2025 11:26:14 -0400 (0:00:00.471) 0:04:18.842 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})
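From here the test fans out into per-volume verification: the included: ... => (item={...}) line above shows test-verify-volume.yml being run once for each entry in _storage_volumes_list. A minimal sketch of what that dispatch in verify-role-results.yml presumably looks like (the loop-variable name is inferred from the storage_test_volume.* references in the checks that follow, not quoted from the file):

    - name: Verify the volumes with no pool were correctly managed
      ansible.builtin.include_tasks: test-verify-volume.yml
      loop: "{{ _storage_volumes_list }}"
      loop_control:
        loop_var: storage_test_volume

Each included pass then walks the subset list (mount, fstab, fs, device, encryption, md, size, cache) that is expanded below.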
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Wednesday 30 July 2025 11:26:15 -0400 (0:00:00.824) 0:04:19.667 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Wednesday 30 July 2025 11:26:16 -0400 (0:00:00.575) 0:04:20.242 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Wednesday 30 July 2025 11:26:17 -0400 (0:00:01.774) 0:04:22.017 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Wednesday 30 July 2025 11:26:18 -0400 (0:00:00.427) 0:04:22.444 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Wednesday 30 July 2025 11:26:19 -0400 (0:00:00.726) 0:04:23.170 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path:
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:26:19 -0400 (0:00:00.725) 0:04:23.896 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:26:20 -0400 (0:00:00.404) 0:04:24.300 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:26:21 -0400 (0:00:00.756) 0:04:25.057 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:26:21 -0400 (0:00:00.693) 0:04:25.751 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:26:22 -0400 (0:00:00.736) 0:04:26.488 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:26:22 -0400 (0:00:00.280) 0:04:26.768 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:26:22 -0400 (0:00:00.243) 0:04:27.012 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:26:23 -0400 (0:00:00.196) 0:04:27.209 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:26:23 -0400 (0:00:00.248) 0:04:27.457 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:26:24 -0400 (0:00:00.921) 0:04:28.379 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:26:25 -0400 (0:00:00.671) 0:04:29.050 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:26:25 -0400 (0:00:00.620) 0:04:29.671 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:26:26 -0400 (0:00:00.571) 0:04:30.243 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:26:26 -0400 (0:00:00.623) 0:04:30.866 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:26:27 -0400 (0:00:00.290) 0:04:31.157 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:26:27 -0400 (0:00:00.705) 0:04:31.862 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:26:28 -0400 (0:00:00.710) 0:04:32.572 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889151.5953088, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889151.5953088, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 451, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753889151.5953088, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:26:29 -0400 (0:00:01.224) 0:04:33.797 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:26:30 -0400 (0:00:00.419) 0:04:34.216 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:26:30 -0400 (0:00:00.265) 0:04:34.482 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:26:30 -0400 (0:00:00.370) 0:04:34.852 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:26:31 -0400 (0:00:00.303) 0:04:35.156 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:26:31 -0400 (0:00:00.286) 0:04:35.442 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:26:31 -0400 
(0:00:00.340) 0:04:35.783 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:26:32 -0400 (0:00:00.312) 0:04:36.095 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:26:34 -0400 (0:00:02.386) 0:04:38.481 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:26:34 -0400 (0:00:00.198) 0:04:38.679 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:26:34 -0400 (0:00:00.238) 0:04:38.918 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:26:35 -0400 (0:00:00.683) 0:04:39.602 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:26:35 -0400 (0:00:00.312) 0:04:39.914 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:26:36 -0400 (0:00:00.316) 0:04:40.231 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:26:36 -0400 (0:00:00.795) 0:04:41.026 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:26:37 -0400 (0:00:00.239) 0:04:41.266 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:26:37 -0400 (0:00:00.309) 0:04:41.575 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:26:38 -0400 (0:00:00.693) 0:04:42.269 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:26:38 -0400 (0:00:00.637) 0:04:42.906 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:26:39 -0400 (0:00:00.558) 0:04:43.465 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:26:40 -0400 (0:00:00.580) 0:04:44.045 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:26:40 -0400 (0:00:00.546) 0:04:44.592 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:26:40 -0400 (0:00:00.255) 0:04:44.848 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:26:41 -0400 (0:00:00.209) 0:04:45.057 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:26:41 -0400 (0:00:00.177) 0:04:45.235 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:26:41 -0400 (0:00:00.193) 0:04:45.428 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:26:41 -0400 (0:00:00.255) 0:04:45.684 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:26:41 -0400 (0:00:00.232) 0:04:45.916 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 11:26:42 -0400 (0:00:00.284) 0:04:46.200 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:26:42 -0400 (0:00:00.209) 0:04:46.409 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 July 2025 11:26:42 -0400 (0:00:00.198) 0:04:46.607 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:26:42 -0400 (0:00:00.238) 0:04:46.846 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:26:43 -0400 (0:00:00.252) 0:04:47.099 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:26:43 -0400 (0:00:00.495) 0:04:47.594 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:26:44 -0400 (0:00:00.443) 0:04:48.038 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:26:44 -0400 (0:00:00.429) 0:04:48.467 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:26:44 -0400 (0:00:00.309) 0:04:48.777 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:26:45 -0400 (0:00:00.614) 0:04:49.392 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:26:45 -0400 (0:00:00.599) 0:04:49.991 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:26:46 -0400 (0:00:00.542) 0:04:50.533 
******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:26:47 -0400 (0:00:00.510) 0:04:51.044 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:26:47 -0400 (0:00:00.442) 0:04:51.486 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:26:47 -0400 (0:00:00.331) 0:04:51.817 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:26:48 -0400 (0:00:00.251) 0:04:52.069 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:26:48 -0400 (0:00:00.444) 0:04:52.514 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:26:48 -0400 (0:00:00.303) 0:04:52.817 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:26:49 -0400 (0:00:00.314) 0:04:53.132 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:26:49 -0400 (0:00:00.344) 0:04:53.477 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was 
False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:26:49 -0400 (0:00:00.346) 0:04:53.823 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:26:50 -0400 (0:00:00.297) 0:04:54.121 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:26:50 -0400 (0:00:00.374) 0:04:54.495 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:26:50 -0400 (0:00:00.340) 0:04:54.835 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:26:51 -0400 (0:00:00.332) 0:04:55.168 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:26:51 -0400 (0:00:00.411) 0:04:55.580 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:26:51 -0400 (0:00:00.359) 0:04:55.939 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:26:52 -0400 (0:00:00.337) 0:04:56.277 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:26:52 -0400 (0:00:00.500) 
0:04:56.777 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:26:53 -0400 (0:00:00.323) 0:04:57.100 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:26:53 -0400 (0:00:00.342) 0:04:57.443 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:26:53 -0400 (0:00:00.583) 0:04:58.026 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:26:54 -0400 (0:00:00.328) 0:04:58.355 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:26:54 -0400 (0:00:00.267) 0:04:58.622 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:26:54 -0400 (0:00:00.290) 0:04:58.913 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:26:55 -0400 (0:00:00.224) 0:04:59.138 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Wednesday 30 July 2025 11:26:55 -0400 (0:00:00.230) 0:04:59.368 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Wednesday 30 July 2025 11:26:55 -0400 (0:00:00.254) 0:04:59.623 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Wednesday 30 July 2025 11:26:55 -0400 (0:00:00.314) 0:04:59.937 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Wednesday 30 July 2025 11:26:56 -0400 (0:00:00.290) 0:05:00.228 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Wednesday 30 July 2025 11:26:56 -0400 (0:00:00.335) 0:05:00.564 ********
changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode - 2] ******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:157
Wednesday 30 July 2025 11:26:57 -0400 (0:00:01.369) 0:05:01.933 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Wednesday 30 July 2025 11:26:58 -0400 (0:00:00.723) 0:05:02.657 ********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Wednesday 30 July 2025 11:26:59 -0400 (0:00:00.712) 0:05:03.369 ********
included: fedora.linux_system_roles.storage for managed-node2
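This is the second safe-mode regression check: /dev/sda now carries an unencrypted xfs filesystem holding the freshly created /opt/test1/quux, and the role is invoked again with encryption requested. Because storage_safe_mode is true (see storage_safe_mode_global above), the role must refuse to reformat a device that holds existing data and raise an error instead of destroying it. A sketch of the requested configuration, assembled from the storage_volumes echo printed further down in this log (the wrapper task shape is an assumption):

    - name: Ask the role to re-encrypt the disk under safe mode (expected to fail)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true  # refuse to destroy existing formats or data
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo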
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:26:59 -0400 (0:00:00.508) 0:05:03.877 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:27:00 -0400 (0:00:00.430) 0:05:04.314 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:27:01 -0400 (0:00:00.726) 0:05:05.041 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 30 July 2025 11:27:01 -0400 (0:00:00.868) 0:05:05.910 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 30 July 2025 11:27:02 -0400 (0:00:00.870) 0:05:06.780 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 30 July 2025 11:27:03 -0400 (0:00:00.331) 0:05:07.112 ********
ok:
[managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:27:03 -0400 (0:00:00.245) 0:05:07.358 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:27:03 -0400 (0:00:00.337) 0:05:07.696 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:27:04 -0400 (0:00:00.929) 0:05:08.625 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:27:07 -0400 (0:00:02.677) 0:05:11.303 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:27:07 -0400 (0:00:00.694) 0:05:11.999 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:27:08 -0400 (0:00:00.625) 0:05:12.624 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:27:10 -0400 (0:00:02.294) 0:05:14.919 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:27:11 -0400 (0:00:00.580) 0:05:15.499 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
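
The storage_volumes value printed above is the test's request to turn the raw disk sda into a LUKS-encrypted XFS volume mounted at /opt/test1. A minimal play producing that input would look roughly like the following (a sketch; the password is the throwaway test value from this log, not a real secret):

    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo

Note that the "Get required packages" step above resolves this spec to the cryptsetup package before any device is touched.
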
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:27:12 -0400 (0:00:00.590) 0:05:16.090 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:27:12 -0400 (0:00:00.457) 0:05:16.548 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:27:13 -0400 (0:00:00.493) 0:05:17.041 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:27:15 -0400 (0:00:02.689) 0:05:19.731 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": 
"cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": 
"emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": 
"inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service": { "name": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:27:19 -0400 (0:00:04.259) 0:05:23.993 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:27:20 -0400 (0:00:00.792) 0:05:24.785 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d453b1ac7\x2d19a4\x2d4c29\x2db50b\x2d1994fb6b8ca9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "name": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket dev-sda.device systemd-journald.socket", "AllowIsolate": "no", 
"AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d453b1ac7\\\\x2d19a4\\\\x2d4c29\\\\x2db50b\\\\x2d1994fb6b8ca9.target\" umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": 
"18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d453b1ac7\\\\x2d19a4\\\\x2d4c29\\\\x2db50b\\\\x2d1994fb6b8ca9.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": 
"system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:26:03 EDT", "StateChangeTimestampMonotonic": "2069927631", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d453b1ac7\\\\x2d19a4\\\\x2d4c29\\\\x2db50b\\\\x2d1994fb6b8ca9.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:27:22 -0400 (0:00:01.771) 0:05:26.556 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:27:24 -0400 (0:00:02.470) 0:05:29.026 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:27:25 -0400 (0:00:00.426) 0:05:29.453 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d453b1ac7\x2d19a4\x2d4c29\x2db50b\x2d1994fb6b8ca9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "name": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", 
"LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d453b1ac7\\x2d19a4\\x2d4c29\\x2db50b\\x2d1994fb6b8ca9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d453b1ac7\\\\x2d19a4\\\\x2d4c29\\\\x2db50b\\\\x2d1994fb6b8ca9.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:27:27 -0400 (0:00:01.698) 0:05:31.151 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:27:27 -0400 (0:00:00.304) 0:05:31.455 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:27:27 -0400 (0:00:00.448) 0:05:31.904 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 30 July 2025 11:27:28 -0400 (0:00:00.360) 0:05:32.265 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889217.6619952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889217.6619952, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753889217.6619952, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "740027803", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 30 July 2025 11:27:29 -0400 (0:00:01.326) 0:05:33.591 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:177 Wednesday 30 July 2025 11:27:29 -0400 (0:00:00.398) 0:05:33.990 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:27:31 -0400 (0:00:01.179) 0:05:35.169 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:27:31 -0400 (0:00:00.516) 0:05:35.686 ******** skipping: [managed-node2] => { "changed": false, 
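
The imported verify-data-preservation.yml checks amount to a stat plus an assertion that the marker file survived: here /opt/test1/quux still exists with the empty-file checksum, confirming the failed run left the data untouched. A sketch of that pattern, with wording and register names assumed rather than taken from the test file:

    - name: Stat the file
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: __file_status

    - name: Assert file presence
      ansible.builtin.assert:
        that: __file_status.stat.exists
        msg: Data was lost by the failed role run
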
"false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:27:32 -0400 (0:00:00.577) 0:05:36.264 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:27:33 -0400 (0:00:00.972) 0:05:37.236 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:27:33 -0400 (0:00:00.330) 0:05:37.567 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:27:33 -0400 (0:00:00.359) 0:05:37.927 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:27:34 -0400 (0:00:00.259) 0:05:38.186 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:27:34 -0400 (0:00:00.215) 0:05:38.402 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:27:35 -0400 (0:00:00.640) 0:05:39.042 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:27:37 -0400 (0:00:02.657) 0:05:41.700 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:27:38 -0400 (0:00:00.542) 0:05:42.242 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:27:38 -0400 (0:00:00.503) 0:05:42.746 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:27:40 -0400 (0:00:02.228) 0:05:44.975 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:27:41 -0400 (0:00:00.506) 0:05:45.503 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:27:42 -0400 (0:00:00.532) 0:05:46.035 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:27:42 -0400 (0:00:00.518) 0:05:46.553 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:27:43 -0400 (0:00:00.556) 0:05:47.110 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:27:45 -0400 (0:00:02.634) 0:05:49.744 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": 
"quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", 
"state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:27:48 -0400 (0:00:03.053) 0:05:52.797 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:27:49 -0400 (0:00:00.710) 0:05:53.508 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:27:49 -0400 (0:00:00.289) 0:05:53.798 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6b89efb3-c071-4e76-a8c4-14387183db86", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": 
"mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:28:01 -0400 (0:00:11.989) 0:06:05.787 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:28:02 -0400 (0:00:00.579) 0:06:06.367 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889161.3604102, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ccc9a30d8b3a62d882018fca302e77cb3cd2d86e", "ctime": 1753889161.3574102, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889161.3574102, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:28:03 -0400 (0:00:01.267) 0:06:07.634 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:28:05 -0400 (0:00:01.726) 0:06:09.361 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK 
[fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:28:05 -0400 (0:00:00.251) 0:06:09.612 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6b89efb3-c071-4e76-a8c4-14387183db86", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:28:05 -0400 (0:00:00.398) 0:06:10.010 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:28:06 -0400 (0:00:00.338) 0:06:10.349 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_raw_device": 
"/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:28:06 -0400 (0:00:00.346) 0:06:10.696 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': 'UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ffce2b95-3173-4b3e-9fff-8f6521887e71" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:28:08 -0400 (0:00:01.640) 0:06:12.337 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:28:10 -0400 (0:00:01.847) 0:06:14.185 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:28:11 -0400 (0:00:01.685) 0:06:15.893 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:28:12 -0400 (0:00:00.706) 0:06:16.601 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:28:14 -0400 (0:00:01.698) 0:06:18.299 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889174.143543, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889166.260461, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 616565129, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1753889166.2613792, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1123729004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:28:15 -0400 (0:00:01.147) 0:06:19.447 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-6b89efb3-c071-4e76-a8c4-14387183db86', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-6b89efb3-c071-4e76-a8c4-14387183db86", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:28:16 -0400 (0:00:01.474) 0:06:20.922 ******** ok: [managed-node2] TASK [Verify role results - 3] ************************************************* task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:190 Wednesday 30 July 2025 11:28:18 -0400 (0:00:01.839) 0:06:22.762 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:28:19 -0400 (0:00:00.791) 0:06:23.553 ******** skipping: [managed-node2] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:28:20 -0400 (0:00:00.512) 0:06:24.066 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:28:20 -0400 (0:00:00.498) 0:06:24.565 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "size": "10G", "type": "crypt", "uuid": "32a8d4fc-60f3-4383-a570-0fabcd70aa3b" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "6b89efb3-c071-4e76-a8c4-14387183db86" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:28:21 -0400 (0:00:01.211) 0:06:25.776 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002983", "end": "2025-07-30 11:28:22.522114", "rc": 0, "start": "2025-07-30 11:28:22.519131" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 30 July 2025 11:28:22 -0400 (0:00:00.993)       0:06:26.769 ********
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003008",
    "end": "2025-07-30 11:28:23.778006",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-07-30 11:28:23.774998"
}

STDOUT:

luks-6b89efb3-c071-4e76-a8c4-14387183db86 /dev/sda -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 30 July 2025 11:28:24 -0400 (0:00:01.280)       0:06:28.050 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Wednesday 30 July 2025 11:28:24 -0400 (0:00:00.617)       0:06:28.668 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'})
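NOTE: The fstab and crypttab entries above are what the storage role writes for a whole-disk LUKS volume. As a minimal sketch only -- reconstructed from the volume facts printed earlier in this log (name 'foo', disk 'sda', xfs mounted on /opt/test1, encryption enabled), not copied from the actual test playbook, and with a placeholder password -- an invocation of this shape produces them:

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo                       # role opens the device as /dev/mapper/luks-<LUKS UUID>, per the _device fact above
                type: disk
                disks: [sda]                    # whole disk, no pool
                fs_type: xfs
                mount_point: /opt/test1         # written to /etc/fstab via the mapper path, as shown above
                encryption: true
                encryption_password: CHANGEME   # placeholder; the real value is logged as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER

The resulting crypttab line uses the usual three-field format (mapping name, backing device, key file); the trailing '-' in "luks-6b89efb3-... /dev/sda -" means no key file is configured, matching the crypttab checks later in this run.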
TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:28:25 -0400 (0:00:00.825) 0:06:29.493 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:28:26 -0400 (0:00:00.636) 0:06:30.130 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:28:27 -0400 (0:00:01.750) 0:06:31.881 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:28:28 -0400 (0:00:00.411) 0:06:32.292 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:28:28 -0400 (0:00:00.653) 0:06:32.946 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current 
mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:28:29 -0400 (0:00:00.657) 0:06:33.604 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:28:29 -0400 (0:00:00.282) 0:06:33.887 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:28:30 -0400 (0:00:00.724) 0:06:34.612 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:28:31 -0400 (0:00:00.744) 0:06:35.356 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:28:31 -0400 (0:00:00.661) 0:06:36.018 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:28:32 -0400 (0:00:00.296) 0:06:36.314 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:28:32 -0400 (0:00:00.296) 0:06:36.611 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:28:32 -0400 (0:00:00.322) 0:06:36.934 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] 
*********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:28:33 -0400 (0:00:00.247) 0:06:37.190 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:28:34 -0400 (0:00:00.861) 0:06:38.052 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:28:34 -0400 (0:00:00.633) 0:06:38.686 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:28:35 -0400 (0:00:00.597) 0:06:39.284 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:28:35 -0400 (0:00:00.712) 0:06:39.996 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:28:36 -0400 (0:00:00.638) 0:06:40.635 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:28:36 -0400 (0:00:00.328) 0:06:40.964 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:28:37 -0400 (0:00:00.684) 0:06:41.648 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is 
present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:28:38 -0400 (0:00:00.501) 0:06:42.149 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889281.2176557, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889281.2176557, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 451, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753889281.2176557, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:28:39 -0400 (0:00:01.238) 0:06:43.387 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:28:39 -0400 (0:00:00.407) 0:06:43.795 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:28:40 -0400 (0:00:00.250) 0:06:44.046 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:28:40 -0400 (0:00:00.386) 0:06:44.432 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:28:40 -0400 (0:00:00.276) 0:06:44.708 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:28:40 -0400 (0:00:00.250) 0:06:44.959 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 30 July 2025 11:28:41 -0400 (0:00:00.327)       0:06:45.287 ********
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "atime": 1753889281.445658,
        "attr_flags": "",
        "attributes": [],
        "block_size": 512,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1753889281.445658,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 1104,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1753889281.445658,
        "nlink": 1,
        "path": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 30 July 2025 11:28:42 -0400 (0:00:01.156)       0:06:46.444 ********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Wednesday 30 July 2025 11:28:44 -0400 (0:00:02.418)       0:06:48.863 ********
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "cryptsetup",
        "luksDump",
        "/dev/sda"
    ],
    "delta": "0:00:00.007456",
    "end": "2025-07-30 11:28:45.749049",
    "rc": 0,
    "start": "2025-07-30 11:28:45.741593"
}

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           6b89efb3-c071-4e76-a8c4-14387183db86
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     676865
        Threads:    2
        Salt:       ad be e6 0b 0d 3f 49 41 18 2c 53 8a df 5c 3d 37 fc 69 7f ee 70 7f b2 2f 07 0f 78 4f 91 fb bb 37
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 129517
        Salt:       07 04 37 93 76 cd c3 7a 56 dd 97 10 82 30 14 ed 3e da a3 b9 ca 34 82 b1 db 77 a7 be 79 3f 7e da
        Digest:     2a 83 06 d7 3d 7e 06 ec 3c a2 fd 43 87 2a bf 3c 4e 46 ab 7b d8 4b e8 7d 7d 8c 6e 10 37 8e 9a 28

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Wednesday 30 July 2025 11:28:45 -0400 (0:00:01.157)       0:06:50.020 ********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Wednesday 30 July 2025 11:28:46 -0400 (0:00:00.651)       0:06:50.671 ********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make
sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:28:47 -0400 (0:00:00.722) 0:06:51.393 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:28:47 -0400 (0:00:00.335) 0:06:51.729 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:28:48 -0400 (0:00:00.963) 0:06:52.692 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:28:48 -0400 (0:00:00.337) 0:06:53.030 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:28:49 -0400 (0:00:00.358) 0:06:53.389 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:28:49 -0400 (0:00:00.338) 0:06:53.727 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6b89efb3-c071-4e76-a8c4-14387183db86 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:28:50 -0400 (0:00:00.647) 0:06:54.375 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:28:51 -0400 (0:00:00.694) 0:06:55.070 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:28:51 
-0400 (0:00:00.751) 0:06:55.822 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:28:52 -0400 (0:00:00.732) 0:06:56.555 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:28:53 -0400 (0:00:00.559) 0:06:57.114 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:28:53 -0400 (0:00:00.282) 0:06:57.396 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:28:53 -0400 (0:00:00.215) 0:06:57.612 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:28:53 -0400 (0:00:00.195) 0:06:57.807 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:28:54 -0400 (0:00:00.260) 0:06:58.068 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:28:54 -0400 (0:00:00.300) 0:06:58.369 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:28:54 -0400 (0:00:00.243) 0:06:58.612 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active 
devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 11:28:54 -0400 (0:00:00.224) 0:06:58.837 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:28:55 -0400 (0:00:00.221) 0:06:59.059 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 July 2025 11:28:55 -0400 (0:00:00.185) 0:06:59.244 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:28:55 -0400 (0:00:00.302) 0:06:59.547 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:28:55 -0400 (0:00:00.267) 0:06:59.815 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:28:56 -0400 (0:00:00.532) 0:07:00.347 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:28:56 -0400 (0:00:00.453) 0:07:00.801 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:28:57 -0400 (0:00:00.568) 0:07:01.369 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:28:57 -0400 (0:00:00.346) 0:07:01.715 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:28:58 -0400 (0:00:00.636) 0:07:02.351 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:28:58 -0400 (0:00:00.599) 0:07:02.962 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:28:59 -0400 (0:00:00.538) 0:07:03.500 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:29:00 -0400 (0:00:00.582) 0:07:04.082 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:29:00 -0400 (0:00:00.592) 0:07:04.675 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:29:00 -0400 (0:00:00.292) 0:07:04.967 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:29:01 -0400 (0:00:00.353) 0:07:05.321 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:29:01 -0400 (0:00:00.283) 0:07:05.604 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is 
none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:29:01 -0400 (0:00:00.326) 0:07:05.931 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:29:02 -0400 (0:00:00.296) 0:07:06.227 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:29:02 -0400 (0:00:00.359) 0:07:06.587 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:29:02 -0400 (0:00:00.383) 0:07:06.975 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:29:03 -0400 (0:00:00.404) 0:07:07.380 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:29:03 -0400 (0:00:00.385) 0:07:07.766 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:29:04 -0400 (0:00:00.320) 0:07:08.086 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:29:04 -0400 (0:00:00.342) 0:07:08.428 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 
2025 11:29:04 -0400 (0:00:00.306) 0:07:08.734 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:29:05 -0400 (0:00:00.328) 0:07:09.063 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:29:05 -0400 (0:00:00.395) 0:07:09.459 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:29:05 -0400 (0:00:00.275) 0:07:09.748 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:29:06 -0400 (0:00:00.345) 0:07:10.094 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:29:06 -0400 (0:00:00.378) 0:07:10.472 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:29:06 -0400 (0:00:00.546) 0:07:11.019 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:29:07 -0400 (0:00:00.270) 0:07:11.290 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 
30 July 2025 11:29:07 -0400 (0:00:00.347) 0:07:11.637 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:29:07 -0400 (0:00:00.254) 0:07:11.892 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:29:08 -0400 (0:00:00.252) 0:07:12.144 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:29:08 -0400 (0:00:00.311) 0:07:12.455 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:29:08 -0400 (0:00:00.229) 0:07:12.685 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:29:08 -0400 (0:00:00.240) 0:07:12.926 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:29:09 -0400 (0:00:00.352) 0:07:13.279 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197 Wednesday 30 July 2025 11:29:09 -0400 (0:00:00.339) 0:07:13.619 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 30 July 2025 11:29:10 -0400 (0:00:00.890) 
0:07:14.509 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 30 July 2025 11:29:11 -0400 (0:00:00.705) 0:07:15.214 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:29:11 -0400 (0:00:00.498) 0:07:15.713 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:29:12 -0400 (0:00:00.485) 0:07:16.198 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:29:12 -0400 (0:00:00.579) 0:07:16.778 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:29:13 -0400 (0:00:00.838) 0:07:17.617 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree 
is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:29:14 -0400 (0:00:00.952) 0:07:18.570 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:29:14 -0400 (0:00:00.387) 0:07:18.957 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:29:15 -0400 (0:00:00.357) 0:07:19.314 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:29:15 -0400 (0:00:00.369) 0:07:19.684 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:29:16 -0400 (0:00:00.966) 0:07:20.650 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:29:18 -0400 (0:00:02.380) 0:07:23.031 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:29:19 -0400 (0:00:00.642) 0:07:23.673 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:29:20 -0400 (0:00:00.685) 0:07:24.359 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:29:22 -0400 (0:00:02.259) 0:07:26.618 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:29:23 -0400 (0:00:00.611) 0:07:27.229 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:29:23 -0400 (0:00:00.493) 0:07:27.723 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:29:24 -0400 (0:00:00.570) 0:07:28.293 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:29:24 -0400 (0:00:00.490) 0:07:28.784 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:29:27 -0400 (0:00:02.643) 0:07:31.427 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", 
"state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": 
"systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:29:30 -0400 (0:00:03.183) 0:07:34.611 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:29:31 -0400 (0:00:00.912) 0:07:35.524 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 
11:29:31 -0400 (0:00:00.288) 0:07:35.812 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:29:34 -0400 (0:00:02.623) 0:07:38.436 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': 
False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:29:34 -0400 (0:00:00.500) 0:07:38.937 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:29:35 -0400 (0:00:00.212) 0:07:39.149 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:29:35 -0400 (0:00:00.291) 0:07:39.440 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:29:35 -0400 (0:00:00.477) 0:07:39.918 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:216 Wednesday 30 July 2025 11:29:36 -0400 (0:00:00.317) 0:07:40.235 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:29:37 -0400 (0:00:01.616) 0:07:41.852 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:29:38 -0400 (0:00:00.509) 0:07:42.362 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:29:38 -0400 (0:00:00.572) 0:07:42.934 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { 
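
(The sequence above is the test's expected-failure check: the role was invoked with encryption: true but no encryption_key or encryption_password, blivet refused with "encrypted volume 'test1' missing key/password", and verify-role-failed.yml then asserts that the failure and message are the expected ones. A sketch of that idiom using block/rescue and assert; this is an assumed shape, and the actual verify-role-failed.yml is structured differently:

    - name: Run the role with a spec that must fail
      block:
        - name: Apply the storage role
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
        - name: Only reached if the role did not fail
          ansible.builtin.fail:
            msg: role did not fail
      rescue:
        - name: Check that we failed in the role
          ansible.builtin.assert:
            that:
              - "'missing key/password' in ansible_failed_result.msg | d('')"

)
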
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:29:39 -0400 (0:00:00.810) 0:07:43.745 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:29:40 -0400 (0:00:00.340) 0:07:44.085 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:29:40 -0400 (0:00:00.331) 0:07:44.416 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:29:40 -0400 (0:00:00.342) 0:07:44.759 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:29:41 -0400 (0:00:00.355) 0:07:45.114 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:29:41 -0400 (0:00:00.733) 0:07:45.848 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing 
to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:29:44 -0400 (0:00:02.405) 0:07:48.254 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:29:44 -0400 (0:00:00.677) 0:07:48.931 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:29:45 -0400 (0:00:00.678) 0:07:49.610 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:29:48 -0400 (0:00:02.433) 0:07:52.043 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:29:48 -0400 (0:00:00.563) 0:07:52.606 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:29:49 -0400 (0:00:00.554) 0:07:53.160 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:29:49 -0400 (0:00:00.532) 0:07:53.693 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:29:50 -0400 (0:00:00.570) 0:07:54.263 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:29:52 -0400 (0:00:02.514) 0:07:56.777 ******** ok: [managed-node2] => { 
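
(This second invocation supplies what the failed run lacked: the "Show storage_pools" output includes an encryption_password for volume test1. Written out as role variables, the spec from the log is roughly the following; the pool and volume values are taken verbatim from the output above, while the play scaffolding around them is assumed:

    - hosts: managed-node2
      tasks:
        - name: Create an encrypted partition volume w/ default fs
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks:
                  - sda
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true
                    encryption_password: yabbadabbadoo

)
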
"ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", 
"status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:29:55 -0400 (0:00:02.953) 0:07:59.731 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:29:56 -0400 (0:00:00.886) 0:08:00.617 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:29:56 -0400 (0:00:00.272) 0:08:00.889 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6b89efb3-c071-4e76-a8c4-14387183db86", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:30:08 -0400 (0:00:11.554) 0:08:12.444 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:30:08 -0400 (0:00:00.556) 0:08:13.001 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889291.6047635, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "99f359c113247be4fb01ef772d05115e57c51b7c", "ctime": 1753889291.6017635, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889291.6017635, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:30:10 -0400 (0:00:01.192) 0:08:14.194 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:30:11 -0400 (0:00:01.442) 0:08:15.636 ******** skipping: [managed-node2] => { "changed": 
false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:30:11 -0400 (0:00:00.193) 0:08:15.830 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6b89efb3-c071-4e76-a8c4-14387183db86", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:30:12 -0400 (0:00:01.139) 0:08:16.969 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:30:13 -0400 (0:00:00.447) 0:08:17.417 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:30:13 -0400 (0:00:00.414) 0:08:17.831 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": 
"0", "src": "/dev/mapper/luks-6b89efb3-c071-4e76-a8c4-14387183db86" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:30:15 -0400 (0:00:01.731) 0:08:19.563 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:30:17 -0400 (0:00:01.689) 0:08:21.253 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:30:19 -0400 (0:00:01.807) 0:08:23.060 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:30:19 -0400 (0:00:00.800) 0:08:23.861 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:30:21 -0400 (0:00:02.079) 0:08:25.940 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889303.77689, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1c02573c61feb19ca476574467b5fa9adcd7945d", "ctime": 1753889296.7048163, "dev": 51713, "device_type": 0, 
"executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 155189453, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753889296.7064, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1661890624", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:30:23 -0400 (0:00:01.225) 0:08:27.166 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-6b89efb3-c071-4e76-a8c4-14387183db86', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-6b89efb3-c071-4e76-a8c4-14387183db86", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:30:25 -0400 (0:00:02.745) 0:08:29.912 ******** ok: [managed-node2] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:233 Wednesday 30 July 2025 11:30:28 -0400 (0:00:02.161) 0:08:32.073 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:30:28 -0400 (0:00:00.920) 0:08:32.994 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:30:29 -0400 (0:00:00.662) 0:08:33.657 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:30:30 -0400 (0:00:00.752) 0:08:34.409 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "size": "10G", "type": "crypt", "uuid": "d2ff81cf-95b5-4b32-9480-9cce54f51831" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "7f16eb98-ed03-46aa-bb0a-8abc8caee7da" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 
July 2025 11:30:31 -0400 (0:00:01.228) 0:08:35.637 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004039", "end": "2025-07-30 11:30:33.510306", "rc": 0, "start": "2025-07-30 11:30:32.506267" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Tue Jul 22 13:28:35 2025
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 30 July 2025 11:30:33 -0400 (0:00:02.139) 0:08:37.792 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002926", "end": "2025-07-30 11:30:34.783147", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:30:34.780221" }

STDOUT:

luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 -
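Together these two files make the encrypted volume persistent: the last fstab entry mounts the mapper device at /opt/test1, and the single crypttab line binds /dev/sda1 to its luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da mapping name at boot. A check along the following lines would assert the crypttab content (a sketch only; the registered variable name is illustrative, not the suite's actual one):

    - name: Assert the expected /etc/crypttab entry is present
      # storage_test_crypttab is a hypothetical register from the 'cat /etc/crypttab' command above
      assert:
        that:
          - "'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 -' in storage_test_crypttab.stdout"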
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 30 July 2025 11:30:34 -0400 (0:00:01.238) 0:08:39.030 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Wednesday 30 July 2025 11:30:35 -0400 (0:00:00.861) 0:08:39.891 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Wednesday 30 July 2025 11:30:36 -0400 (0:00:00.283) 0:08:40.174 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Wednesday 30 July 2025 11:30:36 -0400 (0:00:00.339) 0:08:40.514 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Wednesday 30 July 2025 11:30:36 -0400 (0:00:00.375) 0:08:40.889 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes)

TASK [Set test variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Wednesday 30 July 2025 11:30:37 -0400 (0:00:00.673) 0:08:41.563 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Wednesday 30 July 2025 11:30:37 -0400 (0:00:00.234) 0:08:41.798 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Set pvs lvm length]
****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 30 July 2025 11:30:38 -0400 (0:00:00.238) 0:08:42.036 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 30 July 2025 11:30:38 -0400 (0:00:00.255) 0:08:42.292 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:30:38 -0400 (0:00:00.332) 0:08:42.625 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:30:38 -0400 (0:00:00.272) 0:08:42.897 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:30:39 -0400 (0:00:00.264) 0:08:43.162 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:30:39 -0400 (0:00:00.286) 0:08:43.448 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:30:39 -0400 (0:00:00.283) 0:08:43.732 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:30:39 -0400 (0:00:00.278) 0:08:44.011 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:87102): WARNING **: 11:30:40.954: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, 
OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.246 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 30 July 2025 11:30:41 -0400 (0:00:01.305) 0:08:45.316 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 30 July 2025 11:30:41 -0400 (0:00:00.612) 0:08:45.928 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 30 July 2025 11:30:42 -0400 (0:00:00.731) 0:08:46.660 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 30 July 2025 11:30:42 -0400 (0:00:00.244) 0:08:46.904 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 30 July 2025 11:30:43 -0400 (0:00:00.283) 0:08:47.188 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 30 July 2025 11:30:43 -0400 (0:00:00.299) 0:08:47.487 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 30 July 2025 11:30:43 -0400 (0:00:00.275) 0:08:47.763 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 30 July 2025 11:30:44 -0400 (0:00:00.314) 0:08:48.077 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:30:44 -0400 (0:00:00.335) 0:08:48.412 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:30:44 -0400 (0:00:00.246) 0:08:48.658 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:30:44 -0400 (0:00:00.230) 0:08:48.889 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:30:45 -0400 (0:00:00.213) 0:08:49.102 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:30:45 -0400 (0:00:00.279) 0:08:49.382 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:30:45 -0400 (0:00:00.290) 0:08:49.672 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 
TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:30:46 -0400 (0:00:00.743) 0:08:50.416 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:30:46 -0400 (0:00:00.385) 0:08:50.802 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:30:47 -0400 (0:00:00.654) 0:08:51.457 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:30:47 -0400 (0:00:00.330) 0:08:51.787 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:30:48 -0400 (0:00:00.792) 0:08:52.580 ******** ok: 
[managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Wednesday 30 July 2025 11:30:50 -0400 (0:00:01.491) 0:08:54.071 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Wednesday 30 July 2025 11:30:51 -0400 (0:00:01.129) 0:08:55.201 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Wednesday 30 July 2025 11:30:51 -0400 (0:00:00.297) 0:08:55.498 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Wednesday 30 July 2025 11:30:51 -0400 (0:00:00.338) 0:08:55.837 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Wednesday 30 July 2025 11:30:52 -0400 (0:00:00.798) 0:08:56.635 ********
skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }
skipping: [managed-node2] => { "changed": false }
MSG: All items skipped

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Wednesday 30 July 2025 11:30:52 -0400 (0:00:00.319) 0:08:56.954 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Wednesday 30 July 2025 11:30:53 -0400 (0:00:00.765) 0:08:57.720 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Wednesday 30 July 2025 11:30:53 -0400 (0:00:00.274) 0:08:57.994 ********
skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Wednesday 30 July 2025 11:30:54 -0400 (0:00:00.241) 0:08:58.236 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Wednesday 30 July 2025 11:30:54 -0400 (0:00:00.336) 0:08:58.572 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Wednesday 30 July 2025 11:30:54 -0400 (0:00:00.296) 0:08:58.869 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Wednesday 30 July 2025 11:30:55 -0400 (0:00:00.303) 0:08:59.172 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Wednesday 30 July 2025 11:30:55 -0400 (0:00:00.299) 0:08:59.472 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Wednesday 30 July 2025 11:30:55 -0400 (0:00:00.357) 0:08:59.829 ********
ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Wednesday 30 July 2025 11:30:56 -0400 (0:00:00.355) 0:09:00.184 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'})

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Wednesday 30 July 2025 11:30:56 -0400 (0:00:00.574) 0:09:00.758 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Wednesday 30 July 2025 11:30:57 -0400 (0:00:00.561) 0:09:01.320 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Wednesday 30 July 2025 11:30:59 -0400 (0:00:01.808) 0:09:03.128 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Wednesday 30 July 2025 11:30:59 -0400 (0:00:00.408) 0:09:03.537 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Wednesday 30 July 2025 11:31:00 -0400 (0:00:00.739) 0:09:04.276 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Wednesday 30 July 2025 11:31:00 -0400 (0:00:00.650) 0:09:04.927 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Wednesday 30 July 2025 11:31:01 -0400 (0:00:00.351) 0:09:05.278 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Wednesday 30 July 2025 11:31:01 -0400 (0:00:00.633) 0:09:05.912 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Wednesday 30 July 2025 11:31:02 -0400 (0:00:00.614) 0:09:06.526 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Wednesday 30 July 2025 11:31:03 -0400 (0:00:00.600) 0:09:07.126 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Wednesday 30 July 2025 11:31:03 -0400 (0:00:00.222) 0:09:07.348 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Wednesday 30 July 2025 11:31:03 -0400 (0:00:00.357) 0:09:07.706 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Wednesday 30 July 2025 11:31:03 -0400 (0:00:00.299) 0:09:08.006 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 30 July 2025 11:31:04 -0400 (0:00:00.347) 0:09:08.353 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Wednesday 30 July 2025 11:31:05 -0400 (0:00:00.994) 0:09:09.348 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Wednesday 30 July 2025 11:31:05 -0400 (0:00:00.584) 0:09:09.933 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Wednesday 30 July 2025 11:31:06 -0400 (0:00:00.697) 0:09:10.631 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Wednesday 30 July 2025 11:31:07 -0400 (0:00:00.709) 0:09:11.341 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Wednesday 30 July 2025 11:31:08 -0400 (0:00:00.784) 0:09:12.126 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Wednesday 30 July 2025 11:31:08 -0400 (0:00:00.282) 0:09:12.408 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Wednesday 30 July 2025 11:31:09 -0400 (0:00:00.738) 0:09:13.147 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Wednesday 30 July 2025 11:31:09 -0400 (0:00:00.730) 0:09:13.878 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889407.8209715, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889407.8209715, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1252, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753889407.8209715, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Wednesday 30 July 2025 11:31:11 -0400 (0:00:01.260) 0:09:15.139 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Wednesday 30 July 2025 11:31:11 -0400 (0:00:00.360) 0:09:15.499 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Wednesday 30 July 2025 11:31:11 -0400 (0:00:00.273) 0:09:15.773 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Wednesday 30 July 2025 11:31:12 -0400 (0:00:00.335) 0:09:16.108 ********
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Wednesday 30 July 2025 11:31:12 -0400 (0:00:00.329) 0:09:16.438 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Wednesday 30 July 2025 11:31:12 -0400 (0:00:00.241) 0:09:16.679 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 30 July 2025 11:31:13 -0400 (0:00:00.421) 0:09:17.101 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889408.0739741, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889408.0739741, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1285, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889408.0739741, "nlink": 1, "path": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 30 July 2025 11:31:14 -0400 (0:00:01.364) 0:09:18.466 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Wednesday 30 July 2025 11:31:17 -0400 (0:00:02.588) 0:09:21.054 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007442", "end": "2025-07-30 11:31:18.098271", "rc": 0, "start": "2025-07-30 11:31:18.090829" }
STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           7f16eb98-ed03-46aa-bb0a-8abc8caee7da
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     674120
        Threads:    2
        Salt:       e0 89 24 63 e1 50 f5 da b1 44 15 6e 82 dd 14 3c 74 b0 27 be 20 5d 82 d4 ed bf 2a 86 a7 5f a9 38
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 129774
        Salt:       04 c0 0d 60 74 d8 d9 00 66 b6 62 64 fd 90 e9 e1 a4 74 a6 c0 30 4a 7e bb b1 4b 9a 92 c9 11 39 27
        Digest:     0a e3 ec b7 72 fd fb 3b d9 34 cb 40 0e c2 e0 3b 43 f6 37 ee df 79 f7 3f d2 74 e5 70 34 86 0a 6c

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Wednesday 30 July 2025 11:31:18 -0400 (0:00:01.301) 0:09:22.356 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Wednesday 30 July 2025 11:31:18 -0400 (0:00:00.653) 0:09:23.010 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Wednesday 30 July 2025 11:31:19 -0400 (0:00:00.778) 0:09:23.788 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Wednesday 30 July 2025 11:31:20 -0400 (0:00:00.408) 0:09:24.197 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Wednesday 30 July 2025 11:31:20 -0400 (0:00:00.404) 0:09:24.602 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Wednesday 30 July 2025 11:31:21 -0400 (0:00:00.513) 0:09:25.116 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 30 July 2025 11:31:21 -0400 (0:00:00.347) 0:09:25.465 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 30 July 2025 11:31:21 -0400 (0:00:00.368) 0:09:25.834 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 30 July 2025 11:31:22 -0400 (0:00:00.835) 0:09:26.669 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 30 July 2025 11:31:23 -0400 (0:00:00.641) 0:09:27.310 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 30 July 2025 11:31:23 -0400 (0:00:00.689) 0:09:28.000 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 30 July 2025 11:31:26 -0400 (0:00:02.049) 0:09:30.050 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 30 July 2025 11:31:26 -0400 (0:00:00.764) 0:09:30.814 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Wednesday 30 July 2025 11:31:27 -0400 (0:00:00.365) 0:09:31.180 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Wednesday 30 July 2025 11:31:27 -0400 (0:00:00.250) 0:09:31.430 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Wednesday 30 July 2025 11:31:27 -0400 (0:00:00.283) 0:09:31.713 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Wednesday 30 July 2025 11:31:28 -0400 (0:00:00.301) 0:09:32.015 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Wednesday 30 July 2025 11:31:28 -0400 (0:00:00.291) 0:09:32.307 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Wednesday 30 July 2025 11:31:28 -0400 (0:00:00.217) 0:09:32.524 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Wednesday 30 July 2025 11:31:28 -0400 (0:00:00.251) 0:09:32.776 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Wednesday 30 July 2025 11:31:29 -0400 (0:00:00.262) 0:09:33.038 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Wednesday 30 July 2025 11:31:29 -0400 (0:00:00.300) 0:09:33.339 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Wednesday 30 July 2025 11:31:29 -0400 (0:00:00.179) 0:09:33.519 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Wednesday 30 July 2025 11:31:29 -0400 (0:00:00.359) 0:09:33.878 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Wednesday 30 July 2025 11:31:30 -0400 (0:00:00.647) 0:09:34.526 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Wednesday 30 July 2025 11:31:30 -0400 (0:00:00.492) 0:09:35.018 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Wednesday 30 July 2025 11:31:31 -0400 (0:00:00.623) 0:09:35.642 ********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Wednesday 30 July 2025 11:31:31 -0400 (0:00:00.352) 0:09:35.994 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Wednesday 30 July 2025 11:31:32 -0400 (0:00:00.637) 0:09:36.632 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Wednesday 30 July 2025 11:31:33 -0400 (0:00:00.651) 0:09:37.283 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Wednesday 30 July 2025 11:31:33 -0400 (0:00:00.623) 0:09:37.906 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Wednesday 30 July 2025 11:31:34 -0400 (0:00:00.608) 0:09:38.515 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Wednesday 30 July 2025 11:31:35 -0400 (0:00:00.622) 0:09:39.137 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Wednesday 30 July 2025 11:31:35 -0400 (0:00:00.762) 0:09:39.900 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Wednesday 30 July 2025 11:31:36 -0400 (0:00:00.835) 0:09:40.736 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Wednesday 30 July 2025 11:31:37 -0400 (0:00:00.686) 0:09:41.422 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Wednesday 30 July 2025 11:31:38 -0400 (0:00:00.638) 0:09:42.061 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }
"Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:31:38 -0400 (0:00:00.706) 0:09:42.767 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:31:39 -0400 (0:00:00.670) 0:09:43.437 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:31:40 -0400 (0:00:00.603) 0:09:44.040 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:31:40 -0400 (0:00:00.682) 0:09:44.723 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:31:41 -0400 (0:00:00.702) 0:09:45.425 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:31:42 -0400 (0:00:00.655) 0:09:46.081 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:31:42 -0400 (0:00:00.615) 0:09:46.697 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:31:43 -0400 (0:00:00.684) 0:09:47.382 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:31:43 -0400 (0:00:00.609) 0:09:47.991 ******** 
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:31:44 -0400 (0:00:00.730) 0:09:48.722 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:31:45 -0400 (0:00:00.801) 0:09:49.523 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:31:45 -0400 (0:00:00.395) 0:09:49.919 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:31:46 -0400 (0:00:00.369) 0:09:50.289 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:31:46 -0400 (0:00:00.571) 0:09:50.861 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:31:47 -0400 (0:00:00.252) 0:09:51.114 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:31:47 -0400 (0:00:00.268) 0:09:51.382 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:31:47 -0400 
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Wednesday 30 July 2025 11:31:48 -0400 (0:00:00.329) 0:09:52.086 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Wednesday 30 July 2025 11:31:48 -0400 (0:00:00.351) 0:09:52.438 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Wednesday 30 July 2025 11:31:48 -0400 (0:00:00.299) 0:09:52.737 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Wednesday 30 July 2025 11:31:49 -0400 (0:00:00.319) 0:09:53.056 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Wednesday 30 July 2025 11:31:49 -0400 (0:00:00.353) 0:09:53.410 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Wednesday 30 July 2025 11:31:50 -0400 (0:00:00.745) 0:09:54.155 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Wednesday 30 July 2025 11:31:50 -0400 (0:00:00.293) 0:09:54.448 ********
changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode - 3] ******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:239
Wednesday 30 July 2025 11:31:51 -0400 (0:00:01.220) 0:09:55.668 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Wednesday 30 July 2025 11:31:52 -0400 (0:00:01.005) 0:09:56.674 ********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Wednesday 30 July 2025 11:31:53 -0400 (0:00:00.780) 0:09:57.454 ********
included: fedora.linux_system_roles.storage for managed-node2

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:31:53 -0400 (0:00:00.516) 0:09:57.971 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:31:54 -0400 (0:00:00.538) 0:09:58.509 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:31:55 -0400 (0:00:00.762) 0:09:59.272 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
"CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:31:56 -0400 (0:00:00.869) 0:10:00.142 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:31:56 -0400 (0:00:00.322) 0:10:00.465 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:31:56 -0400 (0:00:00.391) 0:10:00.863 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:31:57 -0400 (0:00:00.305) 0:10:01.168 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:31:57 -0400 (0:00:00.346) 0:10:01.515 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:31:59 -0400 (0:00:01.621) 0:10:03.136 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:32:01 -0400 (0:00:02.472) 0:10:05.609 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:32:02 -0400 (0:00:00.613) 0:10:06.223 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:32:02 -0400 (0:00:00.663) 0:10:06.887 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:32:05 -0400 (0:00:02.632) 0:10:09.519 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:32:06 -0400 (0:00:00.667) 0:10:10.186 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:32:06 -0400 (0:00:00.626) 0:10:10.813 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:32:07 -0400 (0:00:00.553) 0:10:11.366 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:32:07 -0400 (0:00:00.650) 0:10:12.017 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:32:10 -0400 (0:00:02.728) 0:10:14.746 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", 
"source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service": { "name": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": 
"systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:32:14 -0400 (0:00:04.041) 
0:10:18.787 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:32:15 -0400 (0:00:00.778) 0:10:19.566 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d6b89efb3\x2dc071\x2d4e76\x2da8c4\x2d14387183db86.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "name": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target systemd-udevd-kernel.socket dev-sda.device systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d6b89efb3\\\\x2dc071\\\\x2d4e76\\\\x2da8c4\\\\x2d14387183db86.target\"", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-6b89efb3-c071-4e76-a8c4-14387183db86", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6b89efb3-c071-4e76-a8c4-14387183db86 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6b89efb3-c071-4e76-a8c4-14387183db86 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6b89efb3-c071-4e76-a8c4-14387183db86 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6b89efb3-c071-4e76-a8c4-14387183db86 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d6b89efb3\\\\x2dc071\\\\x2d4e76\\\\x2da8c4\\\\x2d14387183db86.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": 
"none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:30:21 EDT", "StateChangeTimestampMonotonic": "2327963412", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d6b89efb3\\\\x2dc071\\\\x2d4e76\\\\x2da8c4\\\\x2d14387183db86.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:32:17 -0400 (0:00:01.641) 0:10:21.207 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:32:19 -0400 (0:00:02.440) 0:10:23.647 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:32:20 -0400 (0:00:00.465) 0:10:24.113 ******** changed: [managed-node2] => 
(item=systemd-cryptsetup@luks\x2d6b89efb3\x2dc071\x2d4e76\x2da8c4\x2d14387183db86.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "name": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d6b89efb3\\x2dc071\\x2d4e76\\x2da8c4\\x2d14387183db86.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d6b89efb3\\\\x2dc071\\\\x2d4e76\\\\x2da8c4\\\\x2d14387183db86.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": 
"infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:32:21 -0400 (0:00:01.641) 0:10:25.754 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:32:22 -0400 (0:00:00.400) 0:10:26.155 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:32:22 -0400 (0:00:00.504) 0:10:26.660 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 30 July 2025 11:32:22 -0400 (0:00:00.319) 0:10:26.980 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889511.44905, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889511.44905, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753889511.44905, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3730033555", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 30 July 2025 11:32:24 -0400 (0:00:01.202) 0:10:28.183 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:263 Wednesday 30 July 2025 11:32:24 -0400 (0:00:00.306) 0:10:28.490 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 
11:32:26 -0400 (0:00:01.547) 0:10:30.037 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:32:26 -0400 (0:00:00.569) 0:10:30.607 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:32:27 -0400 (0:00:00.697) 0:10:31.304 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:32:27 -0400 (0:00:00.710) 0:10:32.015 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:32:28 -0400 (0:00:00.379) 0:10:32.394 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:32:28 
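(The skipping/ok pattern above is the role probing a list of candidate vars files from generic to specific and loading each one that exists, so that more specific files win; here RedHat.yml and CentOS.yml are absent and CentOS_9.yml is loaded. A hedged reconstruction of that pattern follows; the file names and the "__vars_file is file" condition are taken from the log, while the surrounding task details are illustrative.)

    - name: Set platform/version specific variables (illustrative reconstruction)
      ansible.builtin.include_vars: "{{ __vars_file }}"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      loop:
        - RedHat.yml
        - CentOS.yml
        - CentOS_9.yml
      when: __vars_file is file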
-0400 (0:00:00.348) 0:10:32.743 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:32:29 -0400 (0:00:00.304) 0:10:33.047 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:32:29 -0400 (0:00:00.260) 0:10:33.308 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:32:29 -0400 (0:00:00.709) 0:10:34.018 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:32:32 -0400 (0:00:02.534) 0:10:36.553 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:32:33 -0400 (0:00:00.594) 0:10:37.147 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:32:33 -0400 (0:00:00.689) 0:10:37.837 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:32:36 -0400 (0:00:02.603) 0:10:40.440 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:32:36 -0400 (0:00:00.531) 0:10:40.971 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR 
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 30 July 2025 11:32:37 -0400 (0:00:00.552) 0:10:41.524 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 30 July 2025 11:32:38 -0400 (0:00:00.603) 0:10:42.127 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 30 July 2025 11:32:38 -0400 (0:00:00.609) 0:10:42.737 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Wednesday 30 July 2025 11:32:41 -0400 (0:00:02.556) 0:10:45.293 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status":
"enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", 
"state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service": { "name": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:32:44 -0400 (0:00:03.046) 0:10:48.340 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:32:44 -0400 (0:00:00.616) 0:10:48.958 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d7f16eb98\x2ded03\x2d46aa\x2dbb0a\x2d8abc8caee7da.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "name": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket dev-sda1.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", 
"AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": 
"18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", 
"SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:32:21 EDT", "StateChangeTimestampMonotonic": "2447783795", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:32:46 -0400 (0:00:01.663) 0:10:50.622 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": 
"UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:32:49 -0400 (0:00:03.078) 0:10:53.700 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:32:50 -0400 (0:00:00.485) 0:10:54.185 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889418.7840855, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "11cae6a79b63255078197cb9f74d2c32b8359f69", "ctime": 1753889418.7810855, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889418.7810855, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:32:51 -0400 (0:00:01.174) 0:10:55.360 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:32:52 -0400 (0:00:01.276) 0:10:56.636 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d7f16eb98\x2ded03\x2d46aa\x2dbb0a\x2d8abc8caee7da.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "name": 
"systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:32:21 EDT", "StateChangeTimestampMonotonic": "2447783795", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", 
"TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:32:54 -0400 (0:00:01.782) 0:10:58.419 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: 
TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130
Wednesday 30 July 2025 11:32:55 -0400 (0:00:01.468) 0:10:59.887 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134
Wednesday 30 July 2025 11:32:56 -0400 (0:00:00.387) 0:11:00.274 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Wednesday 30 July 2025 11:32:56 -0400 (0:00:00.271) 0:11:00.546 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node2] => (item={'src': '/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 30 July 2025 11:32:58 -0400 (0:00:01.832)
0:11:02.379 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:33:00 -0400 (0:00:01.939) 0:11:04.318 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:33:02 -0400 (0:00:01.875) 0:11:06.194 ******** skipping: [managed-node2] => (item={'src': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:33:02 -0400 (0:00:00.789) 0:11:06.983 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:33:04 -0400 (0:00:02.046) 0:11:09.030 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889434.782252, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8fa1a20b19079e6d0a3558d63a85145332007641", "ctime": 1753889425.5911565, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 385876180, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753889425.5917478, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, 
"roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1130477796", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:33:06 -0400 (0:00:01.207) 0:11:10.237 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:33:07 -0400 (0:00:01.792) 0:11:12.030 ******** ok: [managed-node2] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Wednesday 30 July 2025 11:33:10 -0400 (0:00:02.241) 0:11:14.271 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:33:11 -0400 (0:00:01.030) 0:11:15.302 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out 
volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:33:12 -0400 (0:00:00.856) 0:11:16.158 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:33:12 -0400 (0:00:00.702) 0:11:16.860 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "f8cbad2e-790e-46be-a94e-dac5c5a6a4d6" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:33:14 -0400 (0:00:01.188) 0:11:18.049 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003525", "end": "2025-07-30 11:33:15.119962", "rc": 0, "start": "2025-07-30 11:33:15.116437" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 30 July 2025 11:33:15 -0400 (0:00:01.361) 0:11:19.410 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003440", "end": "2025-07-30 11:33:16.386934", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:33:16.383494" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 30 July 2025 11:33:16 -0400 (0:00:01.247) 0:11:20.658 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', '_kernel_device': 
'/dev/sda1', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 30 July 2025 11:33:17 -0400 (0:00:00.932) 0:11:21.591 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 30 July 2025 11:33:17 -0400 (0:00:00.345) 0:11:21.936 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 30 July 2025 11:33:18 -0400 (0:00:00.292) 0:11:22.228 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 30 July 2025 11:33:18 -0400 (0:00:00.272) 0:11:22.501 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 30 July 2025 11:33:19 -0400 (0:00:00.645) 0:11:23.146 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 30 July 2025 11:33:19 -0400 (0:00:00.284) 0:11:23.430 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 30 July 2025 11:33:19 -0400 (0:00:00.229) 0:11:23.659 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 30 July 2025 11:33:19 -0400 (0:00:00.256) 0:11:23.916 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", 
"skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:33:20 -0400 (0:00:00.314) 0:11:24.230 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:33:20 -0400 (0:00:00.264) 0:11:24.495 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:33:20 -0400 (0:00:00.309) 0:11:24.804 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:33:21 -0400 (0:00:00.344) 0:11:25.149 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:33:21 -0400 (0:00:00.307) 0:11:25.456 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:33:21 -0400 (0:00:00.268) 0:11:25.725 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:92686): WARNING **: 11:33:22.641: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.246 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 30 July 2025 11:33:23 -0400 (0:00:01.315) 0:11:27.040 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 30 July 2025 11:33:23 -0400 (0:00:00.409) 0:11:27.450 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 30 July 2025 11:33:24 -0400 (0:00:00.620) 0:11:28.071 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 30 July 2025 11:33:24 -0400 (0:00:00.294) 0:11:28.365 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 30 July 2025 11:33:24 -0400 (0:00:00.281) 0:11:28.646 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 30 July 2025 11:33:24 -0400 (0:00:00.255) 0:11:28.902 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 30 July 2025 11:33:25 -0400 (0:00:00.296) 0:11:29.198 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 30 July 2025 11:33:25 -0400 (0:00:00.304) 0:11:29.502 ******** skipping: [managed-node2] => { "changed": false, 
"false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:33:25 -0400 (0:00:00.274) 0:11:29.777 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:33:25 -0400 (0:00:00.244) 0:11:30.022 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:33:26 -0400 (0:00:00.300) 0:11:30.342 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:33:26 -0400 (0:00:00.287) 0:11:30.630 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:33:26 -0400 (0:00:00.300) 0:11:30.931 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:33:27 -0400 (0:00:00.362) 0:11:31.293 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:33:27 -0400 (0:00:00.625) 0:11:31.919 ******** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 
'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:33:28 -0400 (0:00:00.280) 0:11:32.199 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:33:28 -0400 (0:00:00.570) 0:11:32.770 ******** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:33:29 -0400 (0:00:00.365) 0:11:33.136 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:33:29 -0400 (0:00:00.798) 0:11:33.935 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 30 July 2025 11:33:30 -0400 (0:00:00.562) 0:11:34.497 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 30 July 2025 11:33:30 -0400 (0:00:00.291) 0:11:34.789 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 
Wednesday 30 July 2025 11:33:31 -0400 (0:00:01.151) 0:11:35.940 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 30 July 2025 11:33:32 -0400 (0:00:00.324) 0:11:36.265 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 30 July 2025 11:33:33 -0400 (0:00:00.829) 0:11:37.094 ******** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task 
path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 30 July 2025 11:33:33 -0400 (0:00:00.337) 0:11:37.431 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 30 July 2025 11:33:34 -0400 (0:00:00.694) 0:11:38.126 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 30 July 2025 11:33:34 -0400 (0:00:00.247) 0:11:38.373 ******** skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 30 July 2025 11:33:34 -0400 (0:00:00.279) 0:11:38.652 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 30 July 2025 11:33:34 -0400 (0:00:00.258) 0:11:38.911 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 30 July 2025 11:33:35 -0400 (0:00:00.241) 0:11:39.153 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 30 July 2025 11:33:35 -0400 (0:00:00.276) 0:11:39.430 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 30 July 2025 11:33:35 -0400 (0:00:00.314) 0:11:39.744 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 30 July 2025 11:33:36 -0400 (0:00:00.332) 0:11:40.077 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, 
"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 30 July 2025 11:33:36 -0400 (0:00:00.294) 0:11:40.371 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:33:36 -0400 (0:00:00.501) 0:11:40.873 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:33:37 -0400 (0:00:00.762) 0:11:41.636 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:33:39 -0400 (0:00:01.754) 0:11:43.390 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:33:39 -0400 (0:00:00.383) 0:11:43.774 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:33:40 -0400 (0:00:00.745) 0:11:44.520 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:33:41 -0400 (0:00:00.813) 0:11:45.334 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:33:41 -0400 (0:00:00.370) 0:11:45.704 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:33:42 -0400 (0:00:00.773) 0:11:46.489 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:33:43 -0400 (0:00:00.696) 0:11:47.185 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 
11:33:43 -0400 (0:00:00.715) 0:11:47.900 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:33:44 -0400 (0:00:00.287) 0:11:48.187 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:33:44 -0400 (0:00:00.320) 0:11:48.508 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:33:44 -0400 (0:00:00.267) 0:11:48.776 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:33:45 -0400 (0:00:00.325) 0:11:49.101 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:33:46 -0400 (0:00:01.257) 0:11:50.358 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:33:47 -0400 (0:00:00.841) 0:11:51.200 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:33:47 -0400 (0:00:00.760) 0:11:51.960 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] 
****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:33:48 -0400 (0:00:00.629) 0:11:52.589 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:33:49 -0400 (0:00:00.642) 0:11:53.232 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:33:49 -0400 (0:00:00.312) 0:11:53.544 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:33:50 -0400 (0:00:00.908) 0:11:54.453 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:33:51 -0400 (0:00:00.674) 0:11:55.127 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889569.3256521, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889569.3256521, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1485, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753889569.3256521, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:33:52 -0400 (0:00:01.210) 0:11:56.338 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:33:52 -0400 (0:00:00.357) 0:11:56.695 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } 
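At this point in the run every encryption-specific check is skipping, because the role has just converted /dev/sda1 from a LUKS2-backed XFS volume to plain XFS: the blivet_output shown earlier records four actions (destroy the xfs format on the /dev/mapper/luks-* device, destroy that LUKS device, destroy the luks format on /dev/sda1, create xfs on /dev/sda1), one /etc/crypttab line was removed, and /etc/fstab now mounts /opt/test1 by the new filesystem UUID. As a minimal sketch of the role invocation that drives this step, under the assumption that it follows the role's documented storage_pools interface (the log shows only the resulting variables, not the literal tests_luks.yml task, and the luks_password variable name here is hypothetical):

    - name: Remove LUKS from the test volume (illustrative sketch, not the literal tests_luks.yml)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: false                           # flipping this from true yields the destroy-format/destroy-device actions above
                encryption_password: "{{ luks_password }}"  # hypothetical variable; a password lets the role open the existing LUKS layer before removing it

With encryption set to false, verification only has to assert that /dev/sda1 carries xfs, that /etc/fstab holds the UUID-based entry for /opt/test1, and that /etc/crypttab no longer references the luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da mapping, which is what the surrounding mount, fstab, device, and crypttab tasks check.
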
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Wednesday 30 July 2025 11:33:52 -0400 (0:00:00.225) 0:11:56.921 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Wednesday 30 July 2025 11:33:53 -0400 (0:00:00.343) 0:11:57.264 ********
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Wednesday 30 July 2025 11:33:53 -0400 (0:00:00.364) 0:11:57.628 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Wednesday 30 July 2025 11:33:53 -0400 (0:00:00.384) 0:11:58.276 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 30 July 2025 11:33:54 -0400 (0:00:00.265) 0:11:58.541 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 30 July 2025 11:33:57 -0400 (0:00:02.558) 0:12:01.099 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Wednesday 30 July 2025 11:33:57 -0400 (0:00:00.307) 0:12:01.410 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Wednesday 30 July 2025 11:33:57 -0400 (0:00:00.300) 0:12:01.711 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Wednesday 30 July 2025 11:33:58 -0400 (0:00:00.783) 0:12:02.494 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Wednesday 30 July 2025 11:33:58 -0400 (0:00:00.240) 0:12:02.734 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Wednesday 30 July 2025 11:33:58 -0400 (0:00:00.234) 0:12:02.969 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Wednesday 30 July 2025 11:33:59 -0400 (0:00:00.293) 0:12:03.262 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Wednesday 30 July 2025 11:33:59 -0400 (0:00:00.251) 0:12:03.514 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 30 July 2025 11:33:59 -0400 (0:00:00.327) 0:12:03.841 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 30 July 2025 11:34:00 -0400 (0:00:00.710) 0:12:04.551 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 30 July 2025 11:34:01 -0400 (0:00:00.655) 0:12:05.207 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 30 July 2025 11:34:01 -0400 (0:00:00.580) 0:12:05.788 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }
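Note: the crypttab checks above and below count the /etc/crypttab entries belonging to this volume; with the volume currently unencrypted the expected count is "0", so the format, backing-device, and key-file validations are all skipped. A sketch of the shape of the counting assertion (illustrative only; whether storage_test_crypttab exposes stdout_lines is an assumption of this sketch):

    - name: Check for /etc/crypttab entry (sketch)
      ansible.builtin.assert:
        that:
          - storage_test_crypttab.stdout_lines | length == _storage_test_expected_crypttab_entries | int

For reference, a crypttab entry has four fields (name, backing device, key file, options); "-", the expected key file above, is the conventional placeholder for "no key file".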
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 30 July 2025 11:34:02 -0400 (0:00:00.594) 0:12:06.383 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 30 July 2025 11:34:03 -0400 (0:00:00.662) 0:12:07.045 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 30 July 2025 11:34:03 -0400 (0:00:00.367) 0:12:07.412 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Wednesday 30 July 2025 11:34:03 -0400 (0:00:00.283) 0:12:07.696 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Wednesday 30 July 2025 11:34:03 -0400 (0:00:00.276) 0:12:07.973 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Wednesday 30 July 2025 11:34:04 -0400 (0:00:00.274) 0:12:08.248 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Wednesday 30 July 2025 11:34:04 -0400 (0:00:00.312) 0:12:08.560 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Wednesday 30 July 2025 11:34:04 -0400 (0:00:00.268) 0:12:08.829 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Wednesday 30 July 2025 11:34:05 -0400 (0:00:01.093) 0:12:09.922 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Wednesday 30 July 2025 11:34:06 -0400 (0:00:00.261) 0:12:10.190 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Wednesday 30 July 2025 11:34:06 -0400 (0:00:00.256) 0:12:10.446 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Wednesday 30 July 2025 11:34:06 -0400 (0:00:00.263) 0:12:10.709 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Wednesday 30 July 2025 11:34:06 -0400 (0:00:00.259) 0:12:10.968 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Wednesday 30 July 2025 11:34:07 -0400 (0:00:00.616) 0:12:11.584 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Wednesday 30 July 2025 11:34:08 -0400 (0:00:00.724) 0:12:12.309 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Wednesday 30 July 2025 11:34:08 -0400 (0:00:00.532) 0:12:12.841 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }
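Note: the next task prints storage_test_expected_size with ansible.builtin.debug. That variable is only established for LVM volumes, so for this partition volume the output shows Ansible's "VARIABLE IS NOT DEFINED!" marker; that is informational, not a failure. The pattern, roughly:

    - name: Show expected size
      ansible.builtin.debug:
        var: storage_test_expected_size   # prints the undefined marker if never set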
TASK [Show expected size] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Wednesday 30 July 2025 11:34:08 -0400 (0:00:00.532) 0:12:12.841 ********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Wednesday 30 July 2025 11:34:09 -0400 (0:00:00.361) 0:12:13.203 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Wednesday 30 July 2025 11:34:09 -0400 (0:00:00.645) 0:12:13.848 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Wednesday 30 July 2025 11:34:10 -0400 (0:00:00.508) 0:12:14.357 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Wednesday 30 July 2025 11:34:10 -0400 (0:00:00.528) 0:12:14.900 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Wednesday 30 July 2025 11:34:11 -0400 (0:00:00.651) 0:12:15.552 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Wednesday 30 July 2025 11:34:12 -0400 (0:00:00.567) 0:12:16.119 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Wednesday 30 July 2025 11:34:12 -0400 (0:00:00.479) 0:12:16.613 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Wednesday 30 July 2025 11:34:13 -0400 (0:00:00.593) 0:12:17.207 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Wednesday 30 July 2025 11:34:13 -0400 (0:00:00.550) 0:12:17.757 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Wednesday 30 July 2025 11:34:14 -0400 (0:00:00.795) 0:12:18.553 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Wednesday 30 July 2025 11:34:15 -0400 (0:00:00.721) 0:12:19.274 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Wednesday 30 July 2025 11:34:15 -0400 (0:00:00.633) 0:12:19.907 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Wednesday 30 July 2025 11:34:16 -0400 (0:00:00.659) 0:12:20.567 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Wednesday 30 July 2025 11:34:17 -0400 (0:00:00.653) 0:12:21.220 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Wednesday 30 July 2025 11:34:17 -0400 (0:00:00.502) 0:12:21.723 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Wednesday 30 July 2025 11:34:18 -0400 (0:00:00.665) 0:12:22.389 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }
"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:34:19 -0400 (0:00:00.579) 0:12:23.543 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:34:19 -0400 (0:00:00.442) 0:12:23.985 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:34:20 -0400 (0:00:00.496) 0:12:24.482 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:34:20 -0400 (0:00:00.533) 0:12:25.016 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:34:21 -0400 (0:00:00.206) 0:12:25.222 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:34:21 -0400 (0:00:00.247) 0:12:25.470 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:34:21 -0400 (0:00:00.481) 0:12:25.951 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:34:22 -0400 (0:00:00.153) 0:12:26.107 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:34:22 -0400 (0:00:00.246) 0:12:26.354 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:34:22 -0400 (0:00:00.303) 0:12:26.658 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:34:22 -0400 (0:00:00.228) 0:12:26.886 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:34:23 -0400 (0:00:00.300) 0:12:27.187 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:34:23 -0400 (0:00:00.253) 0:12:27.440 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:34:23 -0400 (0:00:00.248) 0:12:27.689 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:34:23 -0400 (0:00:00.264) 0:12:27.953 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:34:24 -0400 (0:00:00.362) 0:12:28.316 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": 
null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 30 July 2025 11:34:24 -0400 (0:00:00.264) 0:12:28.581 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:286 Wednesday 30 July 2025 11:34:25 -0400 (0:00:01.182) 0:12:29.764 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 30 July 2025 11:34:26 -0400 (0:00:01.041) 0:12:30.805 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 30 July 2025 11:34:27 -0400 (0:00:00.593) 0:12:31.398 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:34:27 -0400 (0:00:00.545) 0:12:31.944 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:34:28 -0400 (0:00:00.375) 0:12:32.320 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:34:28 -0400 (0:00:00.557) 0:12:32.878 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:34:27 -0400 (0:00:00.545) 0:12:31.944 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:34:28 -0400 (0:00:00.375) 0:12:32.320 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:34:28 -0400 (0:00:00.557) 0:12:32.878 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 30 July 2025 11:34:29 -0400 (0:00:00.881) 0:12:33.759 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 30 July 2025 11:34:30 -0400 (0:00:00.340) 0:12:34.099 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 30 July 2025 11:34:30 -0400 (0:00:00.285) 0:12:34.385 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 30 July 2025 11:34:30 -0400 (0:00:00.259) 0:12:34.644 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 30 July 2025 11:34:30 -0400 (0:00:00.246) 0:12:34.920 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 30 July 2025 11:34:31 -0400 (0:00:00.805) 0:12:35.726 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
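Note: the platform-variables loop above probes a list of candidate vars files and loads whichever exist ("__vars_file is file"); RedHat.yml and CentOS.yml are absent here, and CentOS_9.yml is loaded twice, presumably because the major and full distribution versions both resolve to "9" on this host (an assumption, not verified against the role source). The candidate pattern, sketched with an illustrative variable name:

    __storage_vars_file_candidates:
      - "{{ ansible_facts['os_family'] }}.yml"                   # RedHat.yml
      - "{{ ansible_facts['distribution'] }}.yml"                # CentOS.yml
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"   # CentOS_9.yml
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"         # CentOS_9.yml again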
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 30 July 2025 11:34:34 -0400 (0:00:02.502) 0:12:38.228 ********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 30 July 2025 11:34:34 -0400 (0:00:00.592) 0:12:38.821 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 30 July 2025 11:34:35 -0400 (0:00:00.578) 0:12:39.401 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 30 July 2025 11:34:38 -0400 (0:00:03.483) 0:12:42.885 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 30 July 2025 11:34:40 -0400 (0:00:01.399) 0:12:44.284 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 30 July 2025 11:34:40 -0400 (0:00:00.450) 0:12:44.735 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 30 July 2025 11:34:41 -0400 (0:00:00.515) 0:12:45.250 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 30 July 2025 11:34:41 -0400 (0:00:00.529) 0:12:45.779 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Wednesday 30 July 2025 11:34:44 -0400 (0:00:02.711) 0:12:48.491 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": 
"active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, 
"modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service": { "name": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": 
"systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:34:47 -0400 (0:00:03.004) 0:12:51.496 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:34:48 -0400 (0:00:00.940) 0:12:52.436 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d7f16eb98\x2ded03\x2d46aa\x2dbb0a\x2d8abc8caee7da.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "name": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-udevd-kernel.socket systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" dev-sda1.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.target\" umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": 
"0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:32:21 EDT", "StateChangeTimestampMonotonic": "2447783795", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Wednesday 30 July 2025 11:34:50 -0400 (0:00:01.711) 0:12:54.148 ********
fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Wednesday 30 July 2025 11:34:52 -0400 (0:00:02.623) 0:12:56.771 ********
fatal: [managed-node2]: FAILED! => { "changed": false }

MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None,
'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:34:53 -0400 (0:00:00.443) 0:12:57.214 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d7f16eb98\x2ded03\x2d46aa\x2dbb0a\x2d8abc8caee7da.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "name": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no 
data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d7f16eb98\\x2ded03\\x2d46aa\\x2dbb0a\\x2d8abc8caee7da.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d7f16eb98\\\\x2ded03\\\\x2d46aa\\\\x2dbb0a\\\\x2d8abc8caee7da.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", 
"StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:34:54 -0400 (0:00:01.740) 0:12:58.955 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:34:55 -0400 (0:00:00.375) 0:12:59.330 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:34:55 -0400 (0:00:00.580) 0:12:59.911 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 30 July 2025 11:34:56 -0400 (0:00:00.389) 0:13:00.300 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889665.4636526, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889665.4636526, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753889665.4636526, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2687871447", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 30 
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Wednesday 30 July 2025 11:34:55 -0400 (0:00:00.375) 0:12:59.330 ********
ok: [managed-node2] => { "changed": false }

MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Wednesday 30 July 2025 11:34:55 -0400 (0:00:00.580) 0:12:59.911 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" }

TASK [Stat the file] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Wednesday 30 July 2025 11:34:56 -0400 (0:00:00.389) 0:13:00.300 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889665.4636526, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889665.4636526, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753889665.4636526, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2687871447", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Assert file presence] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Wednesday 30 July 2025 11:34:57 -0400 (0:00:01.196) 0:13:01.497 ********
ok: [managed-node2] => { "changed": false }

MSG: All assertions passed

TASK [Create a key file] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:312
Wednesday 30 July 2025 11:34:57 -0400 (0:00:00.352) 0:13:01.849 ********
ok: [managed-node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testcljrsrmzlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Write the key into the key file] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319
Wednesday 30 July 2025 11:35:00 -0400 (0:00:02.818) 0:13:04.667 ********
ok: [managed-node2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testcljrsrmzlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1753889700.9739943-125134-64680525624283/.source", "state": "file", "uid": 0 }
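With the passphrase variant verified, the test switches to key-file-based LUKS: the two tasks above create /tmp/storage_testcljrsrmzlukskey with mode 0600 and write a 32-byte key into it, and the "Add encryption to the volume - 2" invocation that follows points the volume's encryption_key at that path (the storage_pools dump a few tasks below shows exactly that shape). A minimal standalone sketch of the same idea, assuming the key file is already staged on the managed host as above (host pattern illustrative; safe-mode handling as in the earlier note):

    - name: Encrypt the volume using a LUKS key file (sketch)
      hosts: managed-node2
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_key: /tmp/storage_testcljrsrmzlukskey  # key file path from the tasks above
      roles:
        - fedora.linux_system_roles.storage

Unlike encryption_password, which the module redacts as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER, encryption_key is just a path: only the path appears in the log, while the key material stays in the file on the managed node, which is why the test stages it with file and copy tasks first.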
TASK [Add encryption to the volume - 2] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:326
Wednesday 30 July 2025 11:35:04 -0400 (0:00:03.790) 0:13:08.458 ********
included: fedora.linux_system_roles.storage for managed-node2

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 30 July 2025 11:35:04 -0400 (0:00:00.534) 0:13:08.993 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 30 July 2025 11:35:05 -0400 (0:00:00.519) 0:13:09.512 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 30 July 2025 11:35:06 -0400 (0:00:00.702) 0:13:10.215 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 30 July 2025 11:35:06 -0400 (0:00:00.793) 0:13:11.009 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 30 July 2025 11:35:07 -0400 (0:00:00.340) 0:13:11.349 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 30 July 2025 11:35:07 -0400 (0:00:00.448) 0:13:11.797 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 30 July 2025 11:35:07 -0400 (0:00:00.205) 0:13:12.003 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 30 July 2025 11:35:08 -0400 (0:00:00.272) 0:13:12.276 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 30 July 2025 11:35:08 -0400 (0:00:00.686) 0:13:12.962 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }

MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 30 July 2025 11:35:11 -0400 (0:00:02.604) 0:13:15.567
******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:35:12 -0400 (0:00:00.649) 0:13:16.217 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:35:12 -0400 (0:00:00.686) 0:13:16.903 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:35:15 -0400 (0:00:02.546) 0:13:19.450 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:35:16 -0400 (0:00:00.783) 0:13:20.233 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:35:16 -0400 (0:00:00.637) 0:13:20.871 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:35:17 -0400 (0:00:00.565) 0:13:21.436 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:35:17 -0400 (0:00:00.555) 0:13:21.992 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:35:20 -0400 (0:00:02.393) 0:13:24.386 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": 
{ "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { 
"name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": 
"systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": 
{ "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:35:23 -0400 (0:00:02.886) 0:13:27.272 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:35:24 -0400 (0:00:00.974) 0:13:28.247 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:35:24 -0400 (0:00:00.189) 0:13:28.437 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "password": "/tmp/storage_testcljrsrmzlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:35:35 -0400 (0:00:11.003) 0:13:39.440 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:35:36 -0400 (0:00:00.603) 0:13:40.044 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889581.910783, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8d032274c379c0fc4520c7a4ea1e2ac7b3fda498", "ctime": 1753889581.907783, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889581.907783, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:35:37 -0400 (0:00:01.396) 0:13:41.440 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:35:38 -0400 (0:00:01.320) 0:13:42.761 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:35:38 -0400 (0:00:00.235) 0:13:42.996 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", 
"fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "password": "/tmp/storage_testcljrsrmzlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:35:39 -0400 (0:00:00.412) 0:13:43.409 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:35:39 -0400 (0:00:00.379) 0:13:43.788 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:35:40 -0400 (0:00:00.321) 0:13:44.110 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': 'UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f8cbad2e-790e-46be-a94e-dac5c5a6a4d6" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:35:41 -0400 (0:00:01.862) 0:13:45.972 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:35:43 -0400 (0:00:02.020) 0:13:47.992 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": 
"/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:35:45 -0400 (0:00:01.779) 0:13:49.772 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:35:46 -0400 (0:00:00.887) 0:13:50.659 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:35:48 -0400 (0:00:01.937) 0:13:52.596 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889596.3859339, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753889587.7338438, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 662700401, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1753889587.7352707, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1871493453", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:35:49 -0400 (0:00:01.181) 0:13:53.777 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', 'password': '/tmp/storage_testcljrsrmzlukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": 
"luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "password": "/tmp/storage_testcljrsrmzlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:35:52 -0400 (0:00:02.678) 0:13:56.456 ******** ok: [managed-node2] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:343 Wednesday 30 July 2025 11:35:56 -0400 (0:00:03.722) 0:14:00.179 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:35:56 -0400 (0:00:00.403) 0:14:00.583 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:35:57 -0400 (0:00:00.596) 0:14:01.180 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:35:57 -0400 (0:00:00.618) 0:14:01.798 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "size": "10G", "type": "crypt", "uuid": "1856623a-4799-4c92-9bd2-450965e4fd48" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:35:58 -0400 (0:00:01.153) 0:14:02.976 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002910", "end": "2025-07-30 11:35:59.890721", "rc": 0, "start": "2025-07-30 11:35:59.887811" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 30 July 2025 11:36:00 -0400 (0:00:01.165) 0:14:04.142 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003531", "end": "2025-07-30 11:36:01.081341", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:36:01.077810" } STDOUT: luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 /dev/sda1 /tmp/storage_testcljrsrmzlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 30 July 2025 11:36:01 -0400 (0:00:01.218) 0:14:05.360 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testcljrsrmzlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 30 July 2025 11:36:01 -0400 (0:00:01.218) 0:14:05.360 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testcljrsrmzlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Wednesday 30 July 2025 11:36:02 -0400 (0:00:00.953) 0:14:06.314 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Wednesday 30 July 2025 11:36:02 -0400 (0:00:00.288) 0:14:06.602 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Wednesday 30 July 2025 11:36:02 -0400 (0:00:00.269) 0:14:06.872 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Wednesday 30 July 2025 11:36:03 -0400 (0:00:00.301) 0:14:07.173 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members)
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes)

TASK [Set test variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Wednesday 30 July 2025 11:36:03 -0400 (0:00:00.744) 0:14:07.917 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Wednesday 30 July 2025 11:36:04 -0400 (0:00:00.315) 0:14:08.233 ********
skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Wednesday 30 July 2025 11:36:04 -0400 (0:00:00.265) 0:14:08.498 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Wednesday
30 July 2025 11:36:04 -0400 (0:00:00.282) 0:14:08.781 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:36:05 -0400 (0:00:00.285) 0:14:09.067 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:36:05 -0400 (0:00:00.301) 0:14:09.368 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:36:05 -0400 (0:00:00.283) 0:14:09.651 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:36:05 -0400 (0:00:00.295) 0:14:09.947 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:36:06 -0400 (0:00:00.286) 0:14:10.234 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:36:06 -0400 (0:00:00.259) 0:14:10.494 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:98231): WARNING **: 11:36:07.336: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.246 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 30 July 2025 11:36:07 -0400 (0:00:01.233) 0:14:11.727 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 30 July 2025 11:36:08 -0400 (0:00:00.590) 0:14:12.318 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 30 July 2025 11:36:09 -0400 (0:00:00.743) 0:14:13.061 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 30 July 2025 11:36:09 -0400 (0:00:00.229) 0:14:13.291 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 30 July 2025 11:36:09 -0400 (0:00:00.249) 0:14:13.541 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 30 July 2025 11:36:09 -0400 (0:00:00.338) 0:14:13.879 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 30 July 2025 11:36:10 -0400 (0:00:00.331) 0:14:14.211 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 30 July 2025 11:36:10 -0400 (0:00:00.321) 0:14:14.532 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:36:10 -0400 (0:00:00.230) 0:14:14.763 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:36:10 -0400 (0:00:00.235) 0:14:14.998 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:36:11 -0400 (0:00:00.266) 0:14:15.264 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:36:11 -0400 (0:00:00.286) 0:14:15.551 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:36:11 -0400 (0:00:00.375) 0:14:15.926 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:36:12 -0400 (0:00:00.304) 0:14:16.230 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:36:12 -0400 (0:00:00.686) 0:14:16.917 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testcljrsrmzlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:36:13 -0400 (0:00:00.374) 0:14:17.292 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:36:13 -0400 (0:00:00.634) 0:14:17.926 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testcljrsrmzlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': 
False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:36:14 -0400 (0:00:00.406) 0:14:18.332 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:36:14 -0400 (0:00:00.671) 0:14:19.004 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 30 July 2025 11:36:15 -0400 (0:00:00.718) 0:14:19.723 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 30 July 2025 11:36:15 -0400 (0:00:00.200) 0:14:19.923 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 30 July 2025 11:36:16 -0400 (0:00:00.196) 0:14:20.120 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 30 July 2025 11:36:16 -0400 (0:00:00.306) 0:14:20.427 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 30 July 2025 11:36:17 -0400 (0:00:00.762) 0:14:21.189 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testcljrsrmzlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testcljrsrmzlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 30 July 2025 11:36:17 -0400 (0:00:00.369) 0:14:21.559 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 30 July 2025 11:36:18 -0400 (0:00:00.808) 0:14:22.367 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 30 July 2025 11:36:18 -0400 (0:00:00.273) 0:14:22.641 ******** skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 30 July 2025 11:36:18 -0400 (0:00:00.223) 0:14:22.865 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 30 July 2025 11:36:19 -0400 (0:00:00.188) 0:14:23.054 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 30 July 2025 11:36:19 -0400 (0:00:00.199) 0:14:23.253 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 30 July 2025 11:36:19 -0400 (0:00:00.277) 0:14:23.531 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 30 July 2025 11:36:19 -0400 (0:00:00.190) 0:14:23.721 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 30 July 2025 11:36:19 -0400 (0:00:00.255) 0:14:23.977 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 30 July 2025 11:36:20 -0400 (0:00:00.334) 0:14:24.312 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testcljrsrmzlukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:36:21 -0400 (0:00:01.491) 0:14:25.804 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:36:22 -0400 (0:00:00.753) 0:14:26.557 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:36:24 -0400 (0:00:01.547) 0:14:28.105 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:36:24 -0400 (0:00:00.277) 0:14:28.382 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:36:24 -0400 (0:00:00.563) 0:14:28.946 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:36:25 -0400 (0:00:00.880) 0:14:29.827 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:36:26 -0400 (0:00:00.352) 0:14:30.179 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:36:26 -0400 (0:00:00.709) 0:14:30.889 ******** skipping: 
[managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:36:27 -0400 (0:00:00.624) 0:14:31.513 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:36:28 -0400 (0:00:00.645) 0:14:32.159 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:36:28 -0400 (0:00:00.319) 0:14:32.478 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:36:28 -0400 (0:00:00.345) 0:14:32.824 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:36:29 -0400 (0:00:00.296) 0:14:33.120 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:36:29 -0400 (0:00:00.303) 0:14:33.423 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:36:30 -0400 (0:00:00.929) 0:14:34.353 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount 
point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:36:31 -0400 (0:00:00.813) 0:14:35.167 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:36:31 -0400 (0:00:00.700) 0:14:35.867 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:36:32 -0400 (0:00:00.586) 0:14:36.454 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:36:33 -0400 (0:00:00.720) 0:14:37.174 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:36:33 -0400 (0:00:00.361) 0:14:37.536 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:36:34 -0400 (0:00:00.782) 0:14:38.318 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:36:34 -0400 (0:00:00.704) 0:14:39.023 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889734.8103745, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889734.8103745, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1485, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753889734.8103745, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of 
the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:36:36 -0400 (0:00:01.193) 0:14:40.217 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:36:36 -0400 (0:00:00.443) 0:14:40.660 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:36:36 -0400 (0:00:00.277) 0:14:40.938 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:36:37 -0400 (0:00:00.404) 0:14:41.342 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:36:37 -0400 (0:00:00.317) 0:14:41.659 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:36:37 -0400 (0:00:00.290) 0:14:41.961 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:36:38 -0400 (0:00:00.415) 0:14:42.376 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889735.0563772, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889735.0563772, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1653, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889735.0563772, "nlink": 1, "path": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:36:39 -0400 (0:00:01.324) 0:14:43.701 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:36:42 -0400 (0:00:02.606) 0:14:46.307 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007440", "end": "2025-07-30 11:36:43.214444", "rc": 0, "start": "2025-07-30 11:36:43.207004" } STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     679128
        Threads:    2
        Salt:       1c a7 67 3b 38 60 1e 7e c1 ed 99 a7 dc 89 80 6f
                    ee 90 ea d6 98 d9 bf fb 98 77 2f 60 18 63 32 82
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 129517
        Salt:       fd b9 d8 b9 92 29 40 5b 6e b5 2e 28 5f 60 62 b5
                    f9 75 9b 1b a0 3f 4a 14 cd 29 2c 38 14 e0 af a2
        Digest:     be d2 a9 90 0b a0 8c 50 5d 8a 1a 57 84 f1 a6 55
                    7a 2d 7c 09 1f fa 47 42 dd af be 8c 4e c8 6a 73

TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:36:43 -0400 (0:00:01.206) 0:14:47.514 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:36:44 -0400 (0:00:00.704) 0:14:48.218 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:36:44 -0400 (0:00:00.637) 0:14:48.856 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:36:45 -0400 (0:00:00.409) 0:14:49.266 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:36:45 -0400 (0:00:00.346) 0:14:49.612 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none",
"skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:36:45 -0400 (0:00:00.396) 0:14:50.014 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:36:46 -0400 (0:00:00.352) 0:14:50.366 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:36:46 -0400 (0:00:00.424) 0:14:50.791 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 /dev/sda1 /tmp/storage_testcljrsrmzlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testcljrsrmzlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:36:47 -0400 (0:00:00.711) 0:14:51.502 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:36:48 -0400 (0:00:00.577) 0:14:52.079 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:36:48 -0400 (0:00:00.849) 0:14:52.929 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:36:49 -0400 (0:00:00.621) 0:14:53.550 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:36:50 -0400 (0:00:00.642) 0:14:54.193 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:36:50 -0400 (0:00:00.333) 0:14:54.527 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:36:50 -0400 (0:00:00.267) 0:14:54.794 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:36:51 -0400 (0:00:00.263) 0:14:55.057 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:36:51 -0400 (0:00:00.298) 0:14:55.356 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:36:51 -0400 (0:00:00.264) 0:14:55.620 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:36:51 -0400 (0:00:00.218) 0:14:55.839 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 11:36:52 -0400 (0:00:00.260) 0:14:56.099 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:36:52 -0400 (0:00:00.276) 0:14:56.376 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 
July 2025 11:36:52 -0400 (0:00:00.250) 0:14:56.626 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:36:52 -0400 (0:00:00.286) 0:14:56.912 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:36:53 -0400 (0:00:00.326) 0:14:57.239 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:36:53 -0400 (0:00:00.528) 0:14:57.767 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:36:54 -0400 (0:00:00.504) 0:14:58.279 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:36:54 -0400 (0:00:00.503) 0:14:58.783 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:36:55 -0400 (0:00:00.341) 0:14:59.125 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:36:55 -0400 (0:00:00.743) 0:14:59.869 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:36:56 -0400 (0:00:00.634) 0:15:00.503 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] 
***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:36:57 -0400 (0:00:00.704) 0:15:01.208 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:36:57 -0400 (0:00:00.539) 0:15:01.747 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:36:58 -0400 (0:00:00.607) 0:15:02.355 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:36:59 -0400 (0:00:00.713) 0:15:03.069 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:37:00 -0400 (0:00:01.543) 0:15:04.612 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:37:02 -0400 (0:00:02.361) 0:15:06.974 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:37:03 -0400 (0:00:00.612) 0:15:07.587 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:37:04 -0400 (0:00:00.532) 0:15:08.119 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 
30 July 2025 11:37:04 -0400 (0:00:00.663) 0:15:08.782 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:37:05 -0400 (0:00:00.583) 0:15:09.366 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:37:05 -0400 (0:00:00.613) 0:15:09.980 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:37:06 -0400 (0:00:00.604) 0:15:10.585 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:37:07 -0400 (0:00:00.563) 0:15:11.149 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:37:07 -0400 (0:00:00.626) 0:15:11.775 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:37:08 -0400 (0:00:00.603) 0:15:12.379 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:37:09 -0400 (0:00:00.678) 0:15:13.058 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:37:09 -0400 (0:00:00.738) 0:15:13.796 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:37:10 -0400 (0:00:00.702) 0:15:14.499 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:37:10 -0400 (0:00:00.409) 0:15:14.911 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:37:11 -0400 (0:00:00.346) 0:15:15.257 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:37:11 -0400 (0:00:00.541) 0:15:15.799 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:37:11 -0400 (0:00:00.233) 0:15:16.032 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:37:12 -0400 (0:00:00.256) 0:15:16.289 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:37:12 -0400 (0:00:00.336) 0:15:16.625 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:37:12 -0400 (0:00:00.301) 0:15:16.927 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was 
False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:37:13 -0400 (0:00:00.330) 0:15:17.258 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:37:13 -0400 (0:00:00.286) 0:15:17.544 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:37:13 -0400 (0:00:00.256) 0:15:17.801 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:37:14 -0400 (0:00:00.286) 0:15:18.088 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:37:14 -0400 (0:00:00.668) 0:15:18.757 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:349 Wednesday 30 July 2025 11:37:14 -0400 (0:00:00.262) 0:15:19.019 ******** ok: [managed-node2] => { "changed": false, "path": "/tmp/storage_testcljrsrmzlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:359 Wednesday 30 July 2025 11:37:16 -0400 (0:00:01.334) 0:15:20.354 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 30 July 2025 11:37:16 -0400 (0:00:00.417) 0:15:20.771 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 30 July 2025 11:37:17 -0400 (0:00:00.716) 
0:15:21.487 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:37:18 -0400 (0:00:00.612) 0:15:22.099 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:37:18 -0400 (0:00:00.511) 0:15:22.611 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:37:19 -0400 (0:00:00.564) 0:15:23.175 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:37:20 -0400 (0:00:00.869) 0:15:24.044 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:37:20 -0400 (0:00:00.430) 0:15:24.474 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is 
defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:37:20 -0400 (0:00:00.304) 0:15:24.778 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:37:21 -0400 (0:00:00.327) 0:15:25.106 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:37:21 -0400 (0:00:00.261) 0:15:25.368 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:37:22 -0400 (0:00:00.741) 0:15:26.110 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:37:24 -0400 (0:00:02.709) 0:15:28.819 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:37:25 -0400 (0:00:00.664) 0:15:29.483 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:37:26 -0400 (0:00:00.671) 0:15:30.155 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:37:28 -0400 (0:00:02.698) 0:15:32.853 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:37:29 -0400 (0:00:00.718) 0:15:33.571 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:37:30 -0400 (0:00:00.556) 0:15:34.128 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:37:30 -0400 (0:00:00.553) 0:15:34.681 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:37:31 -0400 (0:00:00.472) 0:15:35.154 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:37:33 -0400 (0:00:02.690) 0:15:37.844 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": 
"cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" 
}, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { 
"name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:37:36 -0400 (0:00:02.979) 0:15:40.824 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:37:37 -0400 (0:00:00.770) 0:15:41.594 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:37:37 -0400 (0:00:00.299) 0:15:41.893 ******** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:37:40 -0400 (0:00:02.588) 0:15:44.482 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, 
TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Wednesday 30 July 2025  11:37:40 -0400 (0:00:02.588)       0:15:44.482 ********
fatal: [managed-node2]: FAILED! => {
    "changed": false
}

MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Wednesday 30 July 2025  11:37:40 -0400 (0:00:00.348)       0:15:44.830 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Wednesday 30 July 2025  11:37:41 -0400 (0:00:00.255)       0:15:45.086 ********
ok: [managed-node2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Wednesday 30 July 2025  11:37:41 -0400 (0:00:00.348)       0:15:45.435 ********
ok: [managed-node2] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Wednesday 30 July 2025  11:37:41 -0400 (0:00:00.523)       0:15:45.958 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__storage_failed_exception is defined",
    "skip_reason": "Conditional result was False"
}
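The verify-role-failed.yml file itself is not reproduced in this log, so the following is only a hypothetical sketch of the kind of expected-failure check such tests typically perform: run the role inside a block, force a failure if it unexpectedly succeeds, and assert on the error message in rescue (ansible_failed_result is Ansible's built-in variable holding the failed task's result):

    - name: Run the storage role and require it to fail
      block:
        - name: Run the role
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
        - name: Fail if the role unexpectedly succeeded
          ansible.builtin.fail:
            msg: role was expected to fail but did not
      rescue:
        - name: Check that we failed in the role
          ansible.builtin.assert:
            that:
              - "'missing key/password' in ansible_failed_result.msg | d('')"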
"libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:37:46 -0400 (0:00:00.934) 0:15:50.302 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:37:46 -0400 (0:00:00.362) 0:15:50.665 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:37:47 -0400 (0:00:00.380) 0:15:51.046 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:37:47 -0400 (0:00:00.272) 0:15:51.319 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:37:47 -0400 (0:00:00.374) 0:15:51.694 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:37:48 -0400 (0:00:00.723) 0:15:52.417 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] 
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 30 July 2025  11:37:46 -0400 (0:00:00.934)       0:15:50.302 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 30 July 2025  11:37:46 -0400 (0:00:00.362)       0:15:50.665 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 30 July 2025  11:37:47 -0400 (0:00:00.380)       0:15:51.046 ********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 30 July 2025  11:37:47 -0400 (0:00:00.272)       0:15:51.319 ********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 30 July 2025  11:37:47 -0400 (0:00:00.374)       0:15:51.694 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 30 July 2025  11:37:48 -0400 (0:00:00.723)       0:15:52.417 ********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 30 July 2025  11:37:50 -0400 (0:00:02.396)       0:15:54.814 ********
ok: [managed-node2] => {
    "storage_pools | d([])": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "lvm",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_cipher": "aes-xts-plain64",
                    "encryption_key_size": 512,
                    "encryption_luks_version": "luks1",
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g"
                }
            ]
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 30 July 2025  11:37:51 -0400 (0:00:00.813)       0:15:55.628 ********
ok: [managed-node2] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 30 July 2025  11:37:52 -0400 (0:00:00.779)       0:15:56.408 ********
ok: [managed-node2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup",
        "lvm2"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 30 July 2025  11:37:55 -0400 (0:00:02.641)       0:15:59.050 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 30 July 2025  11:37:55 -0400 (0:00:00.624)       0:15:59.675 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 30 July 2025  11:37:56 -0400 (0:00:00.587)       0:16:00.262 ********
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 30 July 2025  11:37:56 -0400 (0:00:00.710)       0:16:00.972 ********
skipping: [managed-node2] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 30 July 2025  11:37:57 -0400 (0:00:00.718)       0:16:01.690 ********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Wednesday 30 July 2025  11:38:00 -0400 (0:00:02.497)       0:16:04.188 ********
ok:
[managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { 
"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": 
"systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:38:04 -0400 (0:00:04.002) 0:16:08.191 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:38:04 -0400 (0:00:00.789) 0:16:08.980 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:38:05 -0400 (0:00:00.313) 0:16:09.294 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:38:14 -0400 (0:00:09.256) 0:16:18.551 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:38:15 -0400 (0:00:00.597) 0:16:19.148 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889745.4774857, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4a309b57d6d8b17f49acc2498cb574b6139985fb", "ctime": 1753889745.4744856, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889745.4744856, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:38:16 -0400 (0:00:01.244) 0:16:20.393 ******** ok: 
[managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:38:17 -0400 (0:00:01.319) 0:16:21.712 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:38:17 -0400 (0:00:00.201) 0:16:21.914 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], 
"encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:38:18 -0400 (0:00:00.539) 0:16:22.454 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:38:18 -0400 (0:00:00.393) 0:16:22.847 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:38:19 -0400 
(0:00:00.334) 0:16:23.182 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node2] => (item={'src': '/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 30 July 2025 11:38:20 -0400 (0:00:01.789) 0:16:24.971 ********
ok: [managed-node2] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Wednesday 30 July 2025 11:38:22 -0400 (0:00:01.920) 0:16:26.892 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node2] => (item={'src': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" }

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Wednesday 30 July 2025 11:38:24 -0400 (0:00:01.743) 0:16:28.636 ********
skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" }, "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => { "changed": false }
MSG: All items skipped

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Wednesday 30 July 2025 11:38:25 -0400 (0:00:00.848) 0:16:29.485 ********
ok: [managed-node2] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Wednesday 30 July 2025 11:38:27 -0400 (0:00:01.995) 0:16:31.480 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889761.080648, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3516a4c988b7cc9f79324772ab79382524beed3a", "ctime": 1753889752.0955544, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 226492872, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753889752.0960858, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "1667907128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Wednesday 30 July 2025 11:38:28 -0400 (0:00:01.207) 0:16:32.687 ********
changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "password": "-", "state": "present" } }
MSG: line added

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Wednesday 30 July 2025 11:38:31 -0400 (0:00:02.699) 0:16:35.387 ********
ok: [managed-node2]

TASK [Verify role results - 7] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:396
Wednesday 30 July 2025 11:38:33 -0400 (0:00:02.093) 0:16:37.481 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2

TASK [Print out pool information] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Wednesday 30 July 2025 11:38:33 -0400 (0:00:00.449) 0:16:37.931 ********
ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null,
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:38:34 -0400 (0:00:00.787) 0:16:38.718 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Wednesday 30 July 2025 11:38:35 -0400 (0:00:00.714) 0:16:39.432 ********
ok: [managed-node2] => { "changed": false, "info": {
    "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" },
    "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "size": "4G", "type": "crypt", "uuid": "cdcaeb37-46de-471e-a757-9aa408ad34d1" },
    "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr" },
    "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" },
    "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" },
    "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" },
    "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" },
    "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" },
    "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" }
} }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Wednesday 30 July 2025 11:38:36 -0400 (0:00:01.269) 0:16:40.702 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002998", "end": "2025-07-30 11:38:37.762557", "rc": 0, "start": "2025-07-30 11:38:37.759559" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Tue Jul 22 13:28:35 2025
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 30 July 2025 11:38:37 -0400 (0:00:01.326) 0:16:42.060 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003069", "end": "2025-07-30 11:38:38.917181", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:38:38.914112" }

STDOUT:

luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 30 July 2025 11:38:39 -0400 (0:00:01.155) 0:16:43.216 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device':
'/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 30 July 2025 11:38:40 -0400 (0:00:00.942) 0:16:44.158 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 30 July 2025 11:38:40 -0400 (0:00:00.350) 0:16:44.508 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029275", "end": "2025-07-30 11:38:41.544794", "rc": 0, "start": "2025-07-30 11:38:41.515519" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 30 July 2025 11:38:41 -0400 (0:00:01.317) 0:16:45.826 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 30 July 2025 11:38:42 -0400 (0:00:00.378) 0:16:46.205 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 30 July 2025 11:38:42 -0400 (0:00:00.731) 0:16:46.936 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 30 July 2025 11:38:43 -0400 (0:00:00.920) 0:16:47.856 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 30 July 2025 11:38:48 -0400 (0:00:04.184) 0:16:52.040 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 30 July 2025 11:38:48 -0400 (0:00:00.701) 0:16:52.742 ******** ok: [managed-node2] => { 
"ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:38:49 -0400 (0:00:00.727) 0:16:53.469 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:38:50 -0400 (0:00:00.771) 0:16:54.241 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:38:50 -0400 (0:00:00.347) 0:16:54.588 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:38:51 -0400 (0:00:00.666) 0:16:55.255 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:38:51 -0400 (0:00:00.352) 0:16:55.607 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:38:51 -0400 (0:00:00.389) 0:16:55.997 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:103484): WARNING **: 11:38:52.998: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.246 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 30 July 2025 11:38:53 -0400 (0:00:01.413) 0:16:57.411 ******** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 30 July 2025 11:38:54 -0400 (0:00:00.666) 0:16:58.078 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 30 July 2025 11:38:55 -0400 (0:00:01.951) 0:17:00.030 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 30 July 2025 11:38:56 -0400 (0:00:00.330) 0:17:00.361 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 30 July 2025 11:38:56 -0400 (0:00:00.323) 0:17:00.684 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 30 July 2025 11:38:56 -0400 (0:00:00.302) 0:17:00.987 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 30 July 2025 11:38:57 -0400 (0:00:00.293) 0:17:01.281 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 30 July 2025 11:38:57 -0400 
(0:00:00.334) 0:17:01.615 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:38:57 -0400 (0:00:00.351) 0:17:01.967 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:38:58 -0400 (0:00:00.301) 0:17:02.269 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:38:58 -0400 (0:00:00.279) 0:17:02.548 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:38:58 -0400 (0:00:00.251) 0:17:02.799 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:38:59 -0400 (0:00:00.264) 0:17:03.064 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:38:59 -0400 (0:00:00.412) 0:17:03.477 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:39:00 -0400 (0:00:00.756) 0:17:04.233 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': 
'/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 30 July 2025 11:39:00 -0400 (0:00:00.552) 0:17:04.786 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 30 July 2025 11:39:01 -0400 (0:00:00.300) 0:17:05.087 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 30 July 2025 11:39:01 -0400 (0:00:00.441) 0:17:05.528 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 30 July 2025 11:39:01 -0400 (0:00:00.376) 0:17:05.905 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 30 July 2025 11:39:02 -0400 (0:00:00.402) 0:17:06.308 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 30 July 2025 11:39:02 -0400 (0:00:00.366) 0:17:06.674 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": 
"Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 30 July 2025 11:39:03 -0400 (0:00:00.412) 0:17:07.087 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:39:03 -0400 (0:00:00.442) 0:17:07.530 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:39:04 -0400 (0:00:00.784) 0:17:08.314 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 30 July 2025 11:39:04 -0400 (0:00:00.629) 0:17:08.944 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 30 July 2025 11:39:05 -0400 (0:00:00.311) 0:17:09.256 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 30 July 2025 11:39:05 -0400 (0:00:00.328) 0:17:09.584 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 30 July 2025 11:39:05 -0400 (0:00:00.290) 0:17:09.875 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:39:06 -0400 (0:00:00.300) 0:17:10.175 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:39:06 -0400 (0:00:00.813) 0:17:10.992 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 30 July 2025 11:39:07 -0400 (0:00:00.648) 0:17:11.641 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 30 July 2025 11:39:07 -0400 (0:00:00.282) 0:17:11.924 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 30 July 2025 11:39:08 -0400 (0:00:00.726) 0:17:12.651 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 30 July 2025 11:39:09 -0400 (0:00:00.754) 0:17:13.405 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
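The pool-level crypttab check above passes because encryption in this run is configured per volume, not per pool member, so zero /etc/crypttab entries are expected for /dev/sda. A sketch of the kind of count assertion being made, reusing the _storage_test_crypttab_entries and _storage_test_expected_crypttab_entries facts shown above (the exact task text in verify-pool-member-crypttab.yml may differ):

- name: Check for /etc/crypttab entry (illustrative sketch)
  ansible.builtin.assert:
    that:
      # entries matching this pool member, compared against the expected count ("0" here)
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int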
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Wednesday 30 July 2025 11:39:10 -0400 (0:00:00.719) 0:17:14.124 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Wednesday 30 July 2025 11:39:10 -0400 (0:00:00.579) 0:17:14.704 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Wednesday 30 July 2025 11:39:11 -0400 (0:00:00.643) 0:17:15.348 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Wednesday 30 July 2025 11:39:11 -0400 (0:00:00.625) 0:17:15.974 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Wednesday 30 July 2025 11:39:12 -0400 (0:00:00.351) 0:17:16.326 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Wednesday 30 July 2025 11:39:12 -0400 (0:00:00.300) 0:17:16.626 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Wednesday 30 July 2025 11:39:13 -0400 (0:00:00.874) 0:17:17.501 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Wednesday 30 July 2025 11:39:14 -0400 (0:00:00.633) 0:17:18.135 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Wednesday 30 July 2025 11:39:14 -0400 (0:00:00.313) 0:17:18.448 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Wednesday 30 July 2025 11:39:14 -0400 (0:00:00.332) 0:17:18.781 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Wednesday 30 July 2025 11:39:15 -0400 (0:00:00.302) 0:17:19.083 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Wednesday 30 July 2025 11:39:15 -0400 (0:00:00.299) 0:17:19.382 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Wednesday 30 July 2025 11:39:15 -0400 (0:00:00.290) 0:17:19.673 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Wednesday 30 July 2025 11:39:15 -0400 (0:00:00.271) 0:17:19.944 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Wednesday 30 July 2025 11:39:16 -0400 (0:00:00.258) 0:17:20.203 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Wednesday 30 July 2025 11:39:17 -0400 (0:00:00.858) 0:17:21.061 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Wednesday 30 July 2025 11:39:17 -0400 (0:00:00.307) 0:17:21.369 ********
skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Wednesday 30 July 2025 11:39:17 -0400 (0:00:00.283) 0:17:21.652 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pool was created] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Wednesday 30 July 2025 11:39:17 -0400 (0:00:00.272) 0:17:21.925 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Wednesday 30 July 2025 11:39:18 -0400 (0:00:00.379) 0:17:22.304 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Wednesday 30 July 2025 11:39:18 -0400 (0:00:00.327) 0:17:22.632 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Wednesday 30 July 2025 11:39:18 -0400 (0:00:00.262) 0:17:22.895 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }
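Every Stratis check above is gated on the pool type, so for this LVM pool the whole file reduces to skips. The gating pattern, sketched with the stratis report subcommand from stratis-cli (the register variable name is taken from the storage_test_stratis_report fact reset above; the command and task body are illustrative, not the test's literal code):

- name: Get stratis pool information (illustrative sketch of the gating pattern)
  ansible.builtin.command: stratis report
  register: storage_test_stratis_report
  changed_when: false
  # the whole Stratis verification file only applies to stratis-type pools
  when: storage_test_pool.type == 'stratis'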
=> { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 30 July 2025 11:39:19 -0400 (0:00:00.304) 0:17:23.199 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 30 July 2025 11:39:19 -0400 (0:00:00.268) 0:17:23.468 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:39:20 -0400 (0:00:00.586) 0:17:24.055 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:39:20 -0400 (0:00:00.742) 0:17:24.797 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => 
(item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:39:23 -0400 (0:00:03.028) 0:17:27.826 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:39:24 -0400 (0:00:00.449) 0:17:28.275 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:39:24 -0400 (0:00:00.735) 0:17:29.010 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:39:25 -0400 (0:00:00.807) 0:17:29.818 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:39:26 -0400 (0:00:00.354) 0:17:30.173 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:39:26 -0400 (0:00:00.617) 0:17:30.791 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
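The mount checks above verified that the LUKS mapping is mounted where the test requested. One way such a check can be expressed, reusing the storage_test_device_path and storage_test_mount_expected_mount_point facts set earlier (this is a sketch under the assumption that ansible_facts.mounts is current on the managed node, not the test's literal code):

- name: Verify the current mount state by device (illustrative sketch)
  ansible.builtin.assert:
    that:
      # exactly one mount whose source is the LUKS device and whose target is /opt/test1
      - ansible_facts.mounts | selectattr('device', 'equalto', storage_test_device_path) | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point) | list | length == 1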
TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Wednesday 30 July 2025 11:39:27 -0400 (0:00:00.608) 0:17:31.400 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Wednesday 30 July 2025 11:39:28 -0400 (0:00:00.703) 0:17:32.103 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Wednesday 30 July 2025 11:39:28 -0400 (0:00:00.281) 0:17:32.384 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Wednesday 30 July 2025 11:39:28 -0400 (0:00:00.288) 0:17:32.673 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Wednesday 30 July 2025 11:39:29 -0400 (0:00:00.390) 0:17:33.064 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 30 July 2025 11:39:29 -0400 (0:00:00.319) 0:17:33.383 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Wednesday 30 July 2025 11:39:30 -0400 (0:00:00.943) 0:17:34.327 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
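The fstab facts above are built by filtering /etc/fstab for the volume's _mount_id; the single-element match lists are what the assertions then count. A sketch of how those match lists can be produced, assuming an earlier task registered the file content as storage_test_fstab (a variable by that name does appear in this run's later cleanup task; the set_fact/assert bodies here are illustrative):

- name: Collect fstab lines that reference the volume (illustrative sketch)
  ansible.builtin.set_fact:
    storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', storage_test_volume._mount_id) | list }}"

- name: Verify that the device identifier appears in /etc/fstab (illustrative sketch)
  ansible.builtin.assert:
    that:
      # exactly one fstab line should reference the LUKS mapping ("1" in this run)
      - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int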
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Wednesday 30 July 2025 11:39:30 -0400 (0:00:00.660) 0:17:34.987 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Wednesday 30 July 2025 11:39:31 -0400 (0:00:00.710) 0:17:35.697 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Wednesday 30 July 2025 11:39:32 -0400 (0:00:00.667) 0:17:36.365 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Wednesday 30 July 2025 11:39:33 -0400 (0:00:00.824) 0:17:37.189 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Wednesday 30 July 2025 11:39:33 -0400 (0:00:00.373) 0:17:37.563 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Wednesday 30 July 2025 11:39:34 -0400 (0:00:00.644) 0:17:38.207 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Wednesday 30 July 2025 11:39:34 -0400 (0:00:00.794) 0:17:39.002 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889893.9200306, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889893.9200306, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1898, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889893.9200306, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
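The stat result above confirms that /dev/mapper/foo-test1 exists and is a block device (isblk: true, group disk, mode 0660). A sketch of the presence check that follows from it (the register name storage_test_dev is illustrative, not the test's own):

- name: See whether the device node is present (illustrative sketch)
  ansible.builtin.stat:
    path: /dev/mapper/foo-test1
  register: storage_test_dev

- name: Verify the presence/absence of the device node (illustrative sketch)
  ansible.builtin.assert:
    that:
      # the volume is expected to be present, so its node must exist and be a block device
      - storage_test_dev.stat.exists
      - storage_test_dev.stat.isblk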
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Wednesday 30 July 2025 11:39:36 -0400 (0:00:01.199) 0:17:40.201 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Wednesday 30 July 2025 11:39:36 -0400 (0:00:00.486) 0:17:40.687 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Wednesday 30 July 2025 11:39:36 -0400 (0:00:00.339) 0:17:41.027 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Wednesday 30 July 2025 11:39:37 -0400 (0:00:00.247) 0:17:41.275 ********
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Wednesday 30 July 2025 11:39:37 -0400 (0:00:00.295) 0:17:41.570 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Wednesday 30 July 2025 11:39:37 -0400 (0:00:00.229) 0:17:41.800 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 30 July 2025 11:39:38 -0400 (0:00:00.322) 0:17:42.122 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889894.1680331, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889894.1680331, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1946, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889894.1680331, "nlink": 1, "path": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 30 July 2025 11:39:39 -0400 (0:00:01.227) 0:17:43.349 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Wednesday 30 July 2025 11:39:41 -0400 (0:00:02.316) 0:17:45.665 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.006688", "end": "2025-07-30 11:39:42.678916", "rc": 0, "start": "2025-07-30 11:39:42.672228" }

STDOUT:

LUKS header information for /dev/mapper/foo-test1

Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 16384
MK bits:        512
MK digest:      75 fc 7e e7 84 22 55 77 54 b0 ed c4 bf 79 84 cf 1f 6a 75 f6
MK salt:        b5 ef 5a 4d 6e 65 c9 48 d9 3e ba 95 bb 1f 87 3a f6 83 5b da 0e d8 d9 2b 4a 8f 43 80 c2 22 88 8d
MK iterations:  129774
UUID:           8d0472a4-e7da-4e74-ad6e-553b7dfa8ada

Key Slot 0: ENABLED
        Iterations:             2080506
        Salt:                   be ad 31 90 b4 3f 43 32 be 94 0e 51 8f cf ab ce 54 dc 14 41 fa 42 c7 f6 c5 10 0a 14 dc 7c dc ba
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Wednesday 30 July 2025 11:39:42 -0400 (0:00:01.293) 0:17:46.959 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Wednesday 30 July 2025 11:39:43 -0400 (0:00:00.712) 0:17:47.672 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Wednesday 30 July 2025 11:39:44 -0400 (0:00:00.696) 0:17:48.369 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Wednesday 30 July 2025 11:39:44 -0400 (0:00:00.380) 0:17:48.750 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Wednesday 30 July 2025 11:39:45 -0400 (0:00:00.356) 0:17:49.106 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Wednesday 30 July 2025 11:39:46 -0400 (0:00:00.949) 0:17:50.056 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
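The luksDump output above is where the requested encryption settings become verifiable: Version: 1 corresponds to encryption_luks_version: luks1, MK bits: 512 to encryption_key_size: 512, and the cipher name/mode pair aes + xts-plain64 to encryption_cipher: aes-xts-plain64. A sketch of assertions over that output, mirroring what the Check LUKS version/key size/cipher tasks verify (the register name storage_test_luks_dump is illustrative):

- name: Collect LUKS info for this volume (illustrative sketch)
  ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
  register: storage_test_luks_dump
  changed_when: false

- name: Check LUKS version, key size, and cipher (illustrative sketch)
  ansible.builtin.assert:
    that:
      - storage_test_luks_dump.stdout is search('Version:\s+1')            # luks1
      - storage_test_luks_dump.stdout is search('MK bits:\s+512')          # 512-bit master key
      - storage_test_luks_dump.stdout is search('Cipher name:\s+aes')
      - storage_test_luks_dump.stdout is search('Cipher mode:\s+xts-plain64')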
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 30 July 2025 11:39:46 -0400 (0:00:00.792) 0:17:50.848 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Set test variables] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 30 July 2025 11:39:47 -0400 (0:00:00.954) 0:17:51.802 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 30 July 2025 11:39:48 -0400 (0:00:00.871) 0:17:52.674 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 30 July 2025 11:39:49 -0400 (0:00:00.784) 0:17:53.458 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 30 July 2025 11:39:50 -0400 (0:00:00.795) 0:17:54.254 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 30 July 2025 11:39:51 -0400 (0:00:00.850) 0:17:55.105 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 30 July 2025 11:39:51 -0400 (0:00:00.720) 0:17:55.825 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Wednesday 30 July 2025 11:39:52 -0400 (0:00:00.366) 0:17:56.191 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Wednesday 30 July 2025 11:39:52 -0400 (0:00:00.332) 0:17:56.524 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Wednesday 30 July 2025 11:39:52 -0400 (0:00:00.356) 0:17:56.880 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Wednesday 30 July 2025 11:39:53 -0400 (0:00:00.249) 0:17:57.129 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Wednesday 30 July 2025 11:39:53 -0400 (0:00:00.374) 0:17:57.504 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Wednesday 30 July 2025 11:39:53 -0400 (0:00:00.282) 0:17:57.787 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Wednesday 30 July 2025 11:39:54 -0400 (0:00:00.264) 0:17:58.051 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Wednesday 30 July 2025 11:39:54 -0400 (0:00:00.301) 0:17:58.353 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Wednesday 30 July 2025 11:39:54 -0400 (0:00:00.222) 0:17:58.575 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Wednesday 30 July 2025 11:39:54 -0400 (0:00:00.347) 0:17:58.923 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Wednesday 30 July 2025 11:39:55 -0400 (0:00:00.363) 0:17:59.286 ********
ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Wednesday 30 July 2025 11:39:58 -0400 (0:00:03.189) 0:18:02.476 ********
ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Wednesday 30 July 2025 11:40:00 -0400 (0:00:01.578) 0:18:04.055 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Wednesday 30 July 2025 11:40:00 -0400 (0:00:00.718) 0:18:04.773 ********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Wednesday 30 July 2025 11:40:01 -0400 (0:00:00.352) 0:18:05.125 ********
ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Wednesday 30 July 2025 11:40:02 -0400 (0:00:01.727) 0:18:06.853 ********
skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Wednesday 30 July 2025 11:40:03 -0400 (0:00:00.869) 0:18:07.723 ********
skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Wednesday 30 July 2025 11:40:04 -0400 (0:00:00.713) 0:18:08.437 ********
skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Wednesday 30 July 2025 11:40:04 -0400 (0:00:00.572) 0:18:09.010 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" }
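The size checks parse both the requested "4g" and the LV's actual size down to bytes and compare them; 4 GiB is 4 × 1024³ = 4294967296 bytes, which is exactly what both parses return above. The test uses its own size-parsing task for this; a stand-in sketch of the same conversion using the built-in human_to_bytes filter (the fact name storage_test_requested_bytes is illustrative, storage_test_actual_size comes from the run above):

- name: Parse the requested size of the volume (illustrative stand-in)
  ansible.builtin.set_fact:
    storage_test_requested_bytes: "{{ '4G' | human_to_bytes }}"   # 4294967296

- name: Assert expected size is actual size (illustrative sketch)
  ansible.builtin.assert:
    that:
      - storage_test_actual_size.bytes == storage_test_requested_bytes | int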
TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Wednesday 30 July 2025 11:40:05 -0400 (0:00:00.618) 0:18:09.628 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Wednesday 30 July 2025 11:40:06 -0400 (0:00:00.635) 0:18:10.264 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Wednesday 30 July 2025 11:40:06 -0400 (0:00:00.738) 0:18:11.003 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Wednesday 30 July 2025 11:40:07 -0400 (0:00:00.715) 0:18:11.718 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Wednesday 30 July 2025 11:40:08 -0400 (0:00:00.676) 0:18:12.394 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Wednesday 30 July 2025 11:40:09 -0400 (0:00:00.723) 0:18:13.118 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Wednesday 30 July 2025 11:40:09 -0400 (0:00:00.593) 0:18:13.712 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Wednesday 30 July 2025 11:40:10 -0400 (0:00:00.730) 0:18:14.442 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Wednesday 30 July 2025 11:40:11 -0400 (0:00:00.776) 0:18:15.220 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Wednesday 30 July 2025 11:40:12 -0400 (0:00:00.872) 0:18:16.092 ********
skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Wednesday 30 July 2025 11:40:12 -0400 (0:00:00.656) 0:18:16.749 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Wednesday 30 July 2025 11:40:13 -0400 (0:00:00.658) 0:18:17.408 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Wednesday 30 July 2025 11:40:14 -0400 (0:00:00.744) 0:18:18.152 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Wednesday 30 July 2025 11:40:14 -0400 (0:00:00.682) 0:18:18.834 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Wednesday 30 July 2025 11:40:15 -0400 (0:00:00.775) 0:18:19.610 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Wednesday 30 July 2025 11:40:16 -0400 (0:00:00.710) 0:18:20.320 ********
ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Wednesday 30 July 2025 11:40:17 -0400 (0:00:01.342) 0:18:21.662 ********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Wednesday 30 July 2025 11:40:17 -0400 (0:00:00.356) 0:18:22.019 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Wednesday 30 July 2025 11:40:18 -0400 (0:00:00.733) 0:18:22.752 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030735", "end": "2025-07-30 11:40:19.766747", "rc": 0, "start": "2025-07-30 11:40:19.736012" }

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Wednesday 30 July 2025 11:40:20 -0400 (0:00:01.284) 0:18:24.037 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Wednesday 30 July 2025 11:40:20 -0400 (0:00:00.700) 0:18:24.738 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Wednesday 30 July 2025 11:40:21 -0400 (0:00:00.773) 0:18:25.511 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Wednesday 30 July 2025 11:40:22 -0400 (0:00:00.690) 0:18:26.202 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Wednesday 30 July 2025 11:40:22 -0400 (0:00:00.620) 0:18:26.823 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Wednesday 30 July 2025 11:40:23 -0400 (0:00:00.672) 0:18:27.495 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }
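The lvs invocation above is the ground truth for the cache checks: --nameprefixes emits KEY=VALUE pairs, and LVM2_SEGTYPE=linear confirms the LV is neither cached nor RAID. A sketch of how the segment type can be pulled out of that output (the register name storage_test_lvs is illustrative; storage_test_lv_segtype and the expected value 'linear' come from the run above):

- name: Get information about the LV (illustrative sketch)
  ansible.builtin.command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: storage_test_lvs
  changed_when: false

- name: Set LV segment type (illustrative sketch)
  ansible.builtin.set_fact:
    # regex_search with a '\1' argument returns the captured groups as a list,
    # matching the list-valued storage_test_lv_segtype fact seen in the log
    storage_test_lv_segtype: "{{ storage_test_lvs.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"

- name: Check segment type (illustrative sketch)
  ansible.builtin.assert:
    that:
      - storage_test_lv_segtype[0] == 'linear'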
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:40:24 -0400 (0:00:00.629) 0:18:28.125 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:40:24 -0400 (0:00:00.335) 0:18:28.461 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:40:25 -0400 (0:00:00.647) 0:18:29.108 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:399 Wednesday 30 July 2025 11:40:25 -0400 (0:00:00.337) 0:18:29.446 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:40:26 -0400 (0:00:00.785) 0:18:30.232 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:40:26 -0400 (0:00:00.509) 0:18:30.741 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:40:27 -0400 (0:00:00.578) 0:18:31.320 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => 
(item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:40:28 -0400 (0:00:00.868) 0:18:32.189 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:40:28 -0400 (0:00:00.304) 0:18:32.494 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:40:28 -0400 (0:00:00.336) 0:18:32.830 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:40:29 -0400 (0:00:00.276) 0:18:33.107 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:40:29 -0400 (0:00:00.321) 0:18:33.429 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:40:30 -0400 (0:00:00.988) 0:18:34.417 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:40:33 -0400 (0:00:02.739) 0:18:37.156 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show 
storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:40:33 -0400 (0:00:00.651) 0:18:37.808 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:40:34 -0400 (0:00:00.845) 0:18:38.653 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:40:37 -0400 (0:00:02.731) 0:18:41.385 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:40:37 -0400 (0:00:00.605) 0:18:41.990 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:40:38 -0400 (0:00:00.478) 0:18:42.469 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:40:39 -0400 (0:00:00.606) 0:18:43.076 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:40:39 -0400 (0:00:00.781) 0:18:43.858 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:40:42 -0400 (0:00:02.719) 0:18:46.577 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": 
"auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { 
"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service": { "name": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": 
"systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", 
"state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:40:45 -0400 (0:00:03.117) 0:18:49.694 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:40:46 -0400 (0:00:01.035) 0:18:50.730 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8f215c83\x2de6b6\x2d4cf8\x2da3e2\x2d257c8e3b2a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "name": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device cryptsetup-pre.target -.mount systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" tmp.mount systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d8f215c83\\\\x2de6b6\\\\x2d4cf8\\\\x2da3e2\\\\x2d257c8e3b2a81.target\" cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 /dev/sda1 /tmp/storage_testcljrsrmzlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 /dev/sda1 /tmp/storage_testcljrsrmzlukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8f215c83-e6b6-4cf8-a3e2-257c8e3b2a81 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", 
"MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8f215c83\\\\x2de6b6\\\\x2d4cf8\\\\x2da3e2\\\\x2d257c8e3b2a81.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\" -.mount", "RequiresMountsFor": "/tmp/storage_testcljrsrmzlukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:38:27 EDT", "StateChangeTimestampMonotonic": "2813514352", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d8f215c83\\\\x2de6b6\\\\x2d4cf8\\\\x2da3e2\\\\x2d257c8e3b2a81.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:40:48 -0400 (0:00:01.705) 0:18:52.435 ******** ok: [managed-node2] => { "actions": [], "changed": false, 
"crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:40:51 -0400 (0:00:02.700) 0:18:55.136 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:40:51 -0400 (0:00:00.595) 0:18:55.731 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889904.3281388, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fc681986f832d34bc1c827f24cf8dbf90685177", "ctime": 1753889904.3251388, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889904.3251388, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": 
true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:40:53 -0400 (0:00:01.440) 0:18:57.172 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:40:53 -0400 (0:00:00.410) 0:18:57.583 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8f215c83\x2de6b6\x2d4cf8\x2da3e2\x2d257c8e3b2a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "name": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": 
"18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8f215c83\\x2de6b6\\x2d4cf8\\x2da3e2\\x2d257c8e3b2a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8f215c83\\\\x2de6b6\\\\x2d4cf8\\\\x2da3e2\\\\x2d257c8e3b2a81.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", 
"RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:40:55 -0400 (0:00:01.694) 0:18:59.277 ******** ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:40:55 -0400 (0:00:00.433) 0:18:59.710 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:40:56 -0400 (0:00:00.376) 0:19:00.087 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:40:56 -0400 (0:00:00.353) 0:19:00.441 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:40:57 -0400 (0:00:00.770) 0:19:01.212 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] 
*********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:40:59 -0400 (0:00:01.894) 0:19:03.107 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed-node2] => (item={'src': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:41:00 -0400 (0:00:01.787) 0:19:04.894 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:41:01 -0400 (0:00:00.789) 0:19:05.684 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:41:03 -0400 (0:00:02.083) 0:19:07.767 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889918.9162908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b2d5d8950c6fbd768245af804c46102ba93a4a31", "ctime": 1753889911.1362097, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 469762292, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753889911.1368506, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1482270363", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:41:05 -0400 (0:00:01.316) 0:19:09.084 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:41:05 -0400 (0:00:00.332) 0:19:09.416 ******** ok: [managed-node2] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:413 Wednesday 30 July 2025 11:41:07 -0400 (0:00:02.143) 0:19:11.560 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:420 Wednesday 30 July 2025 11:41:07 -0400 (0:00:00.413) 0:19:11.974 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:41:08 -0400 (0:00:00.733) 0:19:12.707 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] 
******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:41:09 -0400 (0:00:00.705) 0:19:13.412 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:41:09 -0400 (0:00:00.572) 0:19:13.984 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" }, "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "size": "4G", "type": "crypt", "uuid": "cdcaeb37-46de-471e-a757-9aa408ad34d1" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:41:11 -0400 (0:00:01.277) 0:19:15.262 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003403", "end": "2025-07-30 11:41:12.311751", "rc": 0, "start": "2025-07-30 11:41:12.308348" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 30 July 2025 11:41:12 -0400 (0:00:01.314) 0:19:16.577 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003497", "end": "2025-07-30 11:41:13.525786", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:41:13.522289" }

STDOUT:

luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 30 July 2025 11:41:13 -0400 (0:00:01.283) 0:19:17.860 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device':
'/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 30 July 2025 11:41:17 -0400 (0:00:03.223) 0:19:21.083 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 30 July 2025 11:41:17 -0400 (0:00:00.320) 0:19:21.404 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.030578", "end": "2025-07-30 11:41:18.522836", "rc": 0, "start": "2025-07-30 11:41:18.492258" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 30 July 2025 11:41:18 -0400 (0:00:01.386) 0:19:22.791 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 30 July 2025 11:41:19 -0400 (0:00:00.423) 0:19:23.214 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 30 July 2025 11:41:19 -0400 (0:00:00.637) 0:19:23.852 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 30 July 2025 11:41:20 -0400 (0:00:00.774) 0:19:24.626 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 30 July 2025 11:41:21 -0400 (0:00:01.392) 0:19:26.019 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 30 July 2025 11:41:22 -0400 (0:00:00.859) 0:19:26.878 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": 
false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:41:23 -0400 (0:00:00.736) 0:19:27.615 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:41:24 -0400 (0:00:00.605) 0:19:28.220 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:41:24 -0400 (0:00:00.327) 0:19:28.548 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:41:25 -0400 (0:00:00.629) 0:19:29.178 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:41:25 -0400 (0:00:00.353) 0:19:29.531 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:41:25 -0400 (0:00:00.373) 0:19:29.905 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:107904): WARNING **: 11:41:26.833: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.14.246 closed.

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Wednesday 30 July 2025 11:41:27 -0400 (0:00:01.354) 0:19:31.259 ********
skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }
skipping: [managed-node2] => { "changed": false }
MSG: All items skipped

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Wednesday 30 July 2025 11:41:28 -0400 (0:00:00.851) 0:19:32.110 ********
included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Wednesday 30 July 2025 11:41:28 -0400 (0:00:00.740) 0:19:32.851 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Wednesday 30 July 2025 11:41:29 -0400 (0:00:00.263) 0:19:33.114 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Wednesday 30 July 2025 11:41:29 -0400 (0:00:00.366) 0:19:33.480 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Wednesday 30 July 2025 11:41:29 -0400 (0:00:00.275) 0:19:33.756 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Wednesday 30 July 2025 11:41:30 -0400 (0:00:00.310) 0:19:34.066 ********
skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Wednesday 30 July 2025 11:41:30 -0400 (0:00:00.340) 0:19:34.408 ********
skipping: [managed-node2] => {
"changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:41:30 -0400 (0:00:00.255) 0:19:34.663 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:41:30 -0400 (0:00:00.242) 0:19:34.905 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:41:31 -0400 (0:00:00.350) 0:19:35.256 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:41:31 -0400 (0:00:00.278) 0:19:35.534 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:41:31 -0400 (0:00:00.328) 0:19:35.862 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:41:32 -0400 (0:00:00.320) 0:19:36.183 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:41:32 -0400 (0:00:00.735) 0:19:36.919 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 
'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 30 July 2025 11:41:33 -0400 (0:00:00.479) 0:19:37.399 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 30 July 2025 11:41:33 -0400 (0:00:00.311) 0:19:37.710 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 30 July 2025 11:41:33 -0400 (0:00:00.317) 0:19:38.028 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 30 July 2025 11:41:34 -0400 (0:00:00.323) 0:19:38.351 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 30 July 2025 11:41:34 -0400 (0:00:00.350) 0:19:38.701 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 30 July 2025 11:41:34 -0400 (0:00:00.322) 0:19:39.023 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* 
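
Every LVM RAID sub-check in this stretch is skipped with the same false_condition, "storage_test_lvmraid_volume.raid_level is not none", because test1 is a plain linear LV (raid_level: None in the item above). The skipped tasks follow the usual guard idiom, schematically as below; the lvs command is illustrative, since a skipped record does not show the command the task would have run.

    - name: Get information about the LV (guard idiom; command illustrative)
      ansible.builtin.command: >-
        lvs --noheadings -o segtype
        {{ storage_test_pool.name }}/{{ storage_test_lvmraid_volume.name }}
      register: storage_test_lvmraid_segtype
      changed_when: false
      when: storage_test_lvmraid_volume.raid_level is not none
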
task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 30 July 2025 11:41:35 -0400 (0:00:00.377) 0:19:39.401 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:41:35 -0400 (0:00:00.350) 0:19:39.752 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:41:36 -0400 (0:00:00.815) 0:19:40.568 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 30 July 2025 11:41:37 -0400 (0:00:00.608) 0:19:41.176 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 30 July 2025 11:41:37 -0400 (0:00:00.253) 0:19:41.430 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 30 July 2025 11:41:37 -0400 (0:00:00.288) 0:19:41.719 ******** skipping: [managed-node2] => { "changed": false, 
"false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 30 July 2025 11:41:37 -0400 (0:00:00.279) 0:19:41.998 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:41:38 -0400 (0:00:00.269) 0:19:42.268 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:41:38 -0400 (0:00:00.739) 0:19:43.007 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 30 July 2025 11:41:39 -0400 (0:00:00.768) 0:19:43.776 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 30 July 2025 11:41:40 -0400 (0:00:00.332) 0:19:44.109 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 30 July 2025 11:41:40 -0400 (0:00:00.623) 0:19:44.733 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 30 July 2025 11:41:41 -0400 (0:00:00.708) 0:19:45.441 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 30 July 2025 11:41:42 -0400 (0:00:00.634) 0:19:46.076 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 30 July 2025 11:41:42 -0400 (0:00:00.688) 0:19:46.765 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 30 July 2025 11:41:43 -0400 (0:00:00.658) 0:19:47.423 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 30 July 2025 11:41:44 -0400 (0:00:00.682) 0:19:48.105 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 30 July 2025 11:41:44 -0400 (0:00:00.260) 0:19:48.366 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 30 July 2025 11:41:44 -0400 (0:00:00.304) 0:19:48.671 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 30 July 2025 11:41:45 -0400 (0:00:00.855) 0:19:49.526 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 30 July 2025 11:41:46 -0400 (0:00:00.654) 0:19:50.181 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 30 July 2025 11:41:46 -0400 (0:00:00.277) 0:19:50.459 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 30 July 2025 11:41:46 -0400 (0:00:00.343) 0:19:50.803 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 30 July 2025 11:41:47 -0400 (0:00:00.305) 0:19:51.109 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 30 July 2025 11:41:47 -0400 (0:00:00.311) 0:19:51.420 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 30 July 2025 11:41:48 -0400 (0:00:01.266) 0:19:52.686 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 30 July 2025 11:41:49 -0400 (0:00:00.367) 0:19:53.054 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null 
}, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 30 July 2025 11:41:49 -0400 (0:00:00.390) 0:19:53.445 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 30 July 2025 11:41:50 -0400 (0:00:00.811) 0:19:54.256 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 30 July 2025 11:41:50 -0400 (0:00:00.308) 0:19:54.564 ******** skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 30 July 2025 11:41:50 -0400 (0:00:00.287) 0:19:54.851 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 30 July 2025 11:41:51 -0400 (0:00:00.243) 0:19:55.095 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 30 July 2025 11:41:51 -0400 (0:00:00.302) 0:19:55.397 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 30 July 2025 11:41:51 -0400 (0:00:00.266) 0:19:55.663 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 30 July 2025 11:41:51 -0400 (0:00:00.290) 0:19:55.954 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 30 July 2025 11:41:52 -0400 
(0:00:00.369) 0:19:56.324 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 30 July 2025 11:41:52 -0400 (0:00:00.427) 0:19:56.751 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:41:53 -0400 (0:00:00.597) 0:19:57.349 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:41:53 -0400 (0:00:00.529) 0:19:57.878 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:41:55 -0400 (0:00:02.017) 0:19:59.896 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:41:56 -0400 (0:00:00.433) 0:20:00.330 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:41:57 -0400 (0:00:00.764) 0:20:01.095 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:41:57 -0400 (0:00:00.777) 0:20:01.872 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:41:58 -0400 (0:00:00.434) 0:20:02.307 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:41:58 -0400 (0:00:00.715) 0:20:03.022 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:41:59 -0400 (0:00:00.725) 0:20:03.747 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }
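The passing "Verify the current mount state by device" assertion above checks the live mount table for the decrypted mapper device. A minimal standalone sketch of that kind of check, built on facts Ansible gathers by default (illustrative only, not the test's actual task; the device path and mount point are the ones from this run):

- hosts: managed-node2
  tasks:
    - name: Assert the LUKS mapper device is mounted at the expected mount point
      ansible.builtin.assert:
        that:
          # exactly one mount entry pairs this device with /opt/test1
          - ansible_facts.mounts
            | selectattr('device', 'equalto', '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada')
            | selectattr('mount', 'equalto', '/opt/test1')
            | list | length == 1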
TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:42:00 -0400 (0:00:00.778) 0:20:04.526 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:42:00 -0400 (0:00:00.325) 0:20:04.852 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:42:01 -0400 (0:00:00.318) 0:20:05.170 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:42:01 -0400 (0:00:00.301) 0:20:05.472 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:42:01 -0400 (0:00:00.262) 0:20:05.735 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:42:02 -0400 (0:00:01.043) 0:20:06.778 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:42:03 -0400 (0:00:00.741) 0:20:07.520 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
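The expected/actual match counts above are consistent with counting the /etc/fstab lines that reference the device and the mount point. One way such counts could be derived (a sketch; __fstab and __id_matches are hypothetical names, not the test's own variables):

- name: Read /etc/fstab from the managed host
  ansible.builtin.slurp:
    src: /etc/fstab
  register: __fstab

- name: Expect exactly one fstab entry for the encrypted device
  ansible.builtin.assert:
    that: __id_matches | length == 1
  vars:
    # keep only lines that begin with the mapper device path
    __id_matches: "{{ (__fstab.content | b64decode).splitlines() | select('search', '^/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ') | list }}"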
=> { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:42:04 -0400 (0:00:00.636) 0:20:08.961 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:42:05 -0400 (0:00:00.736) 0:20:09.697 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:42:05 -0400 (0:00:00.302) 0:20:10.000 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:42:06 -0400 (0:00:00.712) 0:20:10.712 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:42:07 -0400 (0:00:00.670) 0:20:11.382 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889982.676954, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889893.9200306, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1898, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889893.9200306, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:42:08 -0400 (0:00:01.226) 0:20:12.609 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:42:08 -0400 (0:00:00.383) 0:20:12.992 ******** skipping: [managed-node2] 
=> { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:42:09 -0400 (0:00:00.323) 0:20:13.315 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:42:09 -0400 (0:00:00.331) 0:20:13.647 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:42:09 -0400 (0:00:00.264) 0:20:13.912 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:42:10 -0400 (0:00:00.217) 0:20:14.131 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:42:10 -0400 (0:00:00.313) 0:20:14.445 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890050.7086616, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753889894.1680331, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1946, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753889894.1680331, "nlink": 1, "path": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:42:11 -0400 (0:00:01.200) 0:20:15.646 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:42:14 -0400 (0:00:02.480) 0:20:18.126 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], 
"delta": "0:00:00.006856", "end": "2025-07-30 11:42:15.066179", "rc": 0, "start": "2025-07-30 11:42:15.059323" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 75 fc 7e e7 84 22 55 77 54 b0 ed c4 bf 79 84 cf 1f 6a 75 f6 MK salt: b5 ef 5a 4d 6e 65 c9 48 d9 3e ba 95 bb 1f 87 3a f6 83 5b da 0e d8 d9 2b 4a 8f 43 80 c2 22 88 8d MK iterations: 129774 UUID: 8d0472a4-e7da-4e74-ad6e-553b7dfa8ada Key Slot 0: ENABLED Iterations: 2080506 Salt: be ad 31 90 b4 3f 43 32 be 94 0e 51 8f cf ab ce 54 dc 14 41 fa 42 c7 f6 c5 10 0a 14 dc 7c dc ba Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:42:15 -0400 (0:00:01.221) 0:20:19.348 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:42:16 -0400 (0:00:00.747) 0:20:20.096 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:42:16 -0400 (0:00:00.775) 0:20:20.872 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:42:17 -0400 (0:00:00.421) 0:20:21.293 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:42:17 -0400 (0:00:00.299) 0:20:21.592 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:42:18 -0400 (0:00:00.901) 0:20:22.494 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption_key_size > 0", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:42:18 -0400 (0:00:00.390) 0:20:22.884 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] 
TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:42:19 -0400 (0:00:00.380) 0:20:23.289 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:42:20 -0400 (0:00:01.065) 0:20:24.355 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:42:20 -0400 (0:00:00.669) 0:20:25.024 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:42:21 -0400 (0:00:00.763) 0:20:25.787 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:42:22 -0400 (0:00:00.629) 0:20:26.416 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:42:23 -0400 (0:00:00.709) 0:20:27.126 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
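The crypttab checks above validate the standard three-field entry format: mapping name, backing device, key file, where "-" means no key file is configured. A hedged sketch of such a validation (the __crypttab/__entries names are hypothetical, not the test's own):

- name: Read /etc/crypttab from the managed host
  ansible.builtin.slurp:
    src: /etc/crypttab
  register: __crypttab

- name: Expect one entry for the mapping, backed by the LV, with no key file
  ansible.builtin.assert:
    that:
      - __entries | length == 1
      - (__entries | first).split()[1] == '/dev/mapper/foo-test1'
      - (__entries | first).split()[2] == '-'
  vars:
    # keep only lines whose mapping name starts with the expected luks-UUID prefix
    __entries: "{{ (__crypttab.content | b64decode).splitlines() | select('match', 'luks-8d0472a4') | list }}"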
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:42:23 -0400 (0:00:00.292) 0:20:27.419 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:42:23 -0400 (0:00:00.263) 0:20:27.682 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:42:23 -0400 (0:00:00.257) 0:20:27.944 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:42:24 -0400 (0:00:00.135) 0:20:28.080 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:42:24 -0400 (0:00:00.318) 0:20:28.399 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:42:24 -0400 (0:00:00.233) 0:20:28.632 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 11:42:24 -0400 (0:00:00.226) 0:20:28.858 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:42:25 -0400 (0:00:00.262) 0:20:29.120 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 July 2025 11:42:25 -0400 (0:00:00.319) 0:20:29.445 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:42:25 -0400 (0:00:00.256) 0:20:29.701 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:42:26 -0400 (0:00:00.347) 0:20:30.049 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB",
"size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:42:27 -0400 (0:00:01.704) 0:20:31.753 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:42:29 -0400 (0:00:01.402) 0:20:33.155 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:42:29 -0400 (0:00:00.632) 0:20:33.788 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:42:30 -0400 (0:00:00.380) 0:20:34.168 ******** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:42:31 -0400 (0:00:01.573) 0:20:35.742 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:42:32 -0400 (0:00:00.681) 0:20:36.423 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:42:33 -0400 (0:00:00.718) 0:20:37.142 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:42:33 -0400 (0:00:00.605) 0:20:37.747 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:42:34 -0400 (0:00:00.562) 0:20:38.310 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved 
space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:42:34 -0400 (0:00:00.715) 0:20:39.025 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:42:35 -0400 (0:00:00.597) 0:20:39.623 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:42:36 -0400 (0:00:00.654) 0:20:40.278 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:42:36 -0400 (0:00:00.621) 0:20:40.900 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:42:37 -0400 (0:00:00.599) 0:20:41.500 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:42:38 -0400 (0:00:00.761) 0:20:42.261 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:42:38 -0400 (0:00:00.757) 0:20:43.019 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:42:39 -0400 (0:00:00.669) 0:20:43.688 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:42:40 -0400 (0:00:00.629) 0:20:44.317 ******** skipping: 
[managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:42:40 -0400 (0:00:00.613) 0:20:44.931 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:42:41 -0400 (0:00:00.552) 0:20:45.483 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:42:42 -0400 (0:00:00.651) 0:20:46.135 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:42:45 -0400 (0:00:03.026) 0:20:49.162 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:42:45 -0400 (0:00:00.705) 0:20:49.867 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:42:46 -0400 (0:00:00.643) 0:20:50.511 ******** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:42:46 -0400 (0:00:00.330) 0:20:50.842 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:42:47 -0400 (0:00:00.345) 0:20:51.187 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
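Both the requested and the measured size resolve to the same byte count because '4g' is read as 4 GiB: 4 x 1024^3 = 4294967296 bytes. The same conversion is available as the human_to_bytes filter, so the comparison can be reproduced directly (illustrative; storage_test_actual_size is the fact shown just above):

- name: Confirm the requested size equals the measured size
  ansible.builtin.assert:
    that:
      # 4 GiB expressed in bytes
      - ('4G' | human_to_bytes) == 4294967296
      - ('4G' | human_to_bytes) == storage_test_actual_size.bytes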
TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:42:47 -0400 (0:00:00.734) 0:20:51.922 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.033054", "end": "2025-07-30 11:42:49.007847", "rc": 0, "start": "2025-07-30 11:42:48.974793" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:42:49 -0400 (0:00:01.387) 0:20:53.309 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:42:49 -0400 (0:00:00.616) 0:20:53.926 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:42:50 -0400 (0:00:00.794) 0:20:54.721 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:42:51 -0400 (0:00:00.654) 0:20:55.375 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:42:51 -0400 (0:00:00.609) 0:20:55.985 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:42:52 -0400 (0:00:00.583) 0:20:56.568 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:42:53 -0400 (0:00:00.692) 0:20:57.261 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
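The segment-type check works off the LVM2_SEGTYPE=linear key/value pair in the lvs output above; "linear" confirms no cache layer is attached to the LV. A standalone sketch using the same lvs invocation as the logged task (the register name is hypothetical):

- name: Query segment type and cache stats for foo/test1
  ansible.builtin.command:
    argv: [lvs, --noheadings, --nameprefixes, --units=b, --nosuffix, --unquoted, -o, 'name,attr,cache_total_blocks,chunk_size,segtype', foo/test1]
  register: __lvs
  changed_when: false   # read-only query

- name: Assert the LV is linear, i.e. not cached
  ansible.builtin.assert:
    that: "'LVM2_SEGTYPE=linear' in __lvs.stdout"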
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:42:53 -0400 (0:00:00.355) 0:20:57.616 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:42:54 -0400 (0:00:00.555) 0:20:58.172 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 30 July 2025 11:42:54 -0400 (0:00:00.318) 0:20:58.491 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:426 Wednesday 30 July 2025 11:42:55 -0400 (0:00:01.345) 0:20:59.836 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 30 July 2025 11:42:56 -0400 (0:00:00.738) 0:21:00.575 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 30 July 2025 11:42:57 -0400 (0:00:00.667) 0:21:01.242 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:42:57 -0400 (0:00:00.553) 0:21:01.795 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:42:58 -0400 (0:00:00.520) 0:21:02.315 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }
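At this point the test has a file on the encrypted volume and re-invokes the role with safe mode in effect, expecting the run to fail rather than destroy the existing LUKS layer (and the data just written). The expect-failure pattern of verify-role-failed.yml presumably resembles the following block/rescue sketch (not the file's literal contents):

- name: Expect the storage role to fail under safe mode
  block:
    - name: Re-run the role with a spec that conflicts with the existing encryption
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true
        storage_pools: "{{ storage_pools_global }}"
        storage_volumes: "{{ storage_volumes_global }}"
    - name: Reaching this task means the role did not fail
      ansible.builtin.fail:
        msg: role completed, but a safe-mode failure was expected
  rescue:
    - name: Failure happened as expected; let the test continue
      ansible.builtin.debug:
        msg: role failed under safe mode as expected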
"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:42:59 -0400 (0:00:00.815) 0:21:03.781 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:43:00 -0400 (0:00:00.336) 0:21:04.118 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:43:00 -0400 (0:00:00.286) 0:21:04.404 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:43:00 -0400 (0:00:00.387) 0:21:04.791 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:43:01 -0400 (0:00:00.305) 0:21:05.097 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: 
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:43:01 -0400 (0:00:00.305) 0:21:05.097 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:43:01 -0400 (0:00:00.875) 0:21:05.972 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:43:04 -0400 (0:00:02.640) 0:21:08.612 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:43:05 -0400 (0:00:00.870) 0:21:09.482 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:43:06 -0400 (0:00:00.641) 0:21:10.124 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:43:08 -0400 (0:00:02.696) 0:21:12.820 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:43:09 -0400 (0:00:00.387) 0:21:13.208 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:43:09 -0400 (0:00:00.379) 0:21:13.592 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:43:10 -0400 (0:00:00.672) 0:21:14.264 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30
July 2025 11:43:10 -0400 (0:00:00.586) 0:21:14.851 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:43:13 -0400 (0:00:02.647) 0:21:17.498 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service": { "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:43:17 -0400 (0:00:04.122) 0:21:21.621 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:43:18 -0400 (0:00:00.896) 0:21:22.518 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8d0472a4\x2de7da\x2d4e74\x2dad6e\x2d553b7dfa8ada.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\" \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service 
cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": 
"infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:40:54 EDT", "StateChangeTimestampMonotonic": "2961321992", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:43:20 -0400 (0:00:01.629) 0:21:24.148 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:43:22 -0400 (0:00:02.816) 0:21:26.964 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 
'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:43:23 -0400 (0:00:00.442) 0:21:27.407 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8d0472a4\x2de7da\x2d4e74\x2dad6e\x2d553b7dfa8ada.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", 
"ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:40:54 EDT", "StateChangeTimestampMonotonic": "2961321992", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:43:25 -0400 (0:00:01.884) 0:21:29.292 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:43:25 -0400 (0:00:00.308) 0:21:29.601 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:43:26 -0400 (0:00:00.450) 0:21:30.051 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 30 July 2025 11:43:26 -0400 (0:00:00.348) 
0:21:30.400 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890175.54396, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753890175.54396, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753890175.54396, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3237715491", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 30 July 2025 11:43:27 -0400 (0:00:01.245) 0:21:31.645 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:449 Wednesday 30 July 2025 11:43:28 -0400 (0:00:00.524) 0:21:32.170 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:43:29 -0400 (0:00:01.070) 0:21:33.240 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:43:29 -0400 (0:00:00.567) 0:21:33.808 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:43:30 -0400 (0:00:00.639) 0:21:34.447 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:43:31 -0400 (0:00:00.883) 0:21:35.331 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:43:31 -0400 (0:00:00.402) 0:21:35.733 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:43:32 -0400 (0:00:00.386) 0:21:36.120 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:43:32 -0400 (0:00:00.303) 0:21:36.423 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:43:32 -0400 (0:00:00.412) 0:21:36.835 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:43:33 -0400 (0:00:00.904) 0:21:37.740 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:43:36 -0400 (0:00:02.723) 0:21:40.463 ******** ok: [managed-node2] => { "storage_pools | 
d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:43:37 -0400 (0:00:00.706) 0:21:41.169 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:43:40 -0400 (0:00:03.026) 0:21:44.196 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:43:42 -0400 (0:00:02.637) 0:21:46.834 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:43:43 -0400 (0:00:00.676) 0:21:47.510 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:43:44 -0400 (0:00:00.549) 0:21:48.060 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:43:44 -0400 (0:00:00.671) 0:21:48.732 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:43:45 -0400 (0:00:00.727) 0:21:49.459 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:43:47 -0400 (0:00:02.555) 0:21:52.015 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": 
"rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service": { "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:43:50 -0400 (0:00:02.961) 0:21:54.977 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:43:51 -0400 (0:00:00.873) 0:21:55.851 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8d0472a4\x2de7da\x2d4e74\x2dad6e\x2d553b7dfa8ada.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\" \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:40:54 EDT", "StateChangeTimestampMonotonic": "2961321992", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK 
[fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:43:53 -0400 (0:00:01.643) 0:21:57.494 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:43:56 -0400 (0:00:03.453) 0:22:00.948 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:43:57 -0400 (0:00:00.684) 0:22:01.632 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889904.3281388, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fc681986f832d34bc1c827f24cf8dbf90685177", "ctime": 1753889904.3251388, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753889904.3251388, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:43:59 -0400 (0:00:01.434) 0:22:03.067 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:44:00 -0400 (0:00:01.502) 0:22:04.569 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8d0472a4\x2de7da\x2d4e74\x2dad6e\x2d553b7dfa8ada.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": 
"fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:40:54 EDT", "StateChangeTimestampMonotonic": "2961321992", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:44:02 -0400 (0:00:01.666) 0:22:06.236 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", 
"state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:44:02 -0400 (0:00:00.448) 0:22:06.685 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:44:03 -0400 (0:00:00.472) 0:22:07.157 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:44:03 -0400 (0:00:00.428) 0:22:07.585 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:44:05 -0400 (0:00:01.997) 0:22:09.583 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:44:07 -0400 (0:00:01.858) 0:22:11.442 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:44:09 -0400 (0:00:01.758) 0:22:13.222 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 
'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:44:09 -0400 (0:00:00.739) 0:22:13.961 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:44:11 -0400 (0:00:02.066) 0:22:16.028 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753889918.9162908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b2d5d8950c6fbd768245af804c46102ba93a4a31", "ctime": 1753889911.1362097, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 469762292, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753889911.1368506, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1482270363", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:44:13 -0400 (0:00:01.377) 0:22:17.405 ******** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:44:15 -0400 (0:00:01.786) 0:22:19.191 ******** ok: [managed-node2] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:465 Wednesday 30 July 2025 11:44:17 -0400 (0:00:02.171) 0:22:21.362 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] 
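For context: the task sequence above is the apply phase of removing LUKS from an existing volume. Per the blivet actions, the xfs filesystem inside the LUKS container and the /dev/mapper/luks-8d0472a4-… device are destroyed, the LUKS format is wiped from /dev/mapper/foo-test1, xfs is recreated directly on the logical volume, and the stale /etc/fstab and /etc/crypttab entries are removed (the crypttab task reports "1 line(s) removed"). A minimal playbook sketch that requests this end state, with values taken from the pool facts printed in this log (illustrative; the actual test playbook is tests_luks.yml):

    # Re-declare the volume with encryption disabled; the storage role
    # computes and applies the destroy/create actions shown above.
    - hosts: managed-node2
      become: true
      tasks:
        - name: Recreate test1 as plain (unencrypted) xfs
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: lvm
                disks:
                  - sda
                volumes:
                  - name: test1
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    encryption: false
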
********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:44:18 -0400 (0:00:00.837) 0:22:22.200 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:44:18 -0400 (0:00:00.724) 0:22:22.925 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
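For context: verify-role-results.yml re-reads the device, /etc/fstab, and /etc/crypttab state and asserts that it matches the requested configuration. A sketch of the kind of crypttab check this implies for the removed entry, assuming nothing about the include's internals (illustrative tasks; the real assertions live in the test-verify-* includes):

    - hosts: managed-node2
      tasks:
        - name: Read /etc/crypttab
          ansible.builtin.command: cat /etc/crypttab
          register: crypttab
          changed_when: false    # read-only check

        - name: The removed LUKS entry must no longer be present
          ansible.builtin.assert:
            that:
              - "'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada' not in crypttab.stdout"
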
***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:44:19 -0400 (0:00:00.631) 0:22:23.597 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "09437dad-c2fb-4124-807a-166bd4362d49" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:44:20 -0400 (0:00:01.234) 0:22:24.831 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002948", "end": "2025-07-30 11:44:21.799778", "rc": 0, "start": "2025-07-30 11:44:21.796830" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 30 July 2025 11:44:22 -0400 (0:00:01.284) 0:22:26.116 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002928", "end": "2025-07-30 11:44:23.164291", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:44:23.161363" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 30 July 2025 11:44:23 -0400 (0:00:01.308) 0:22:27.425 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': 
'/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 30 July 2025 11:44:24 -0400 (0:00:00.980) 0:22:28.405 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 30 July 2025 11:44:24 -0400 (0:00:00.193) 0:22:28.598 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.028829", "end": "2025-07-30 11:44:25.546836", "rc": 0, "start": "2025-07-30 11:44:25.518007" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 30 July 2025 11:44:25 -0400 (0:00:01.257) 0:22:29.856 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 30 July 2025 11:44:26 -0400 (0:00:00.427) 0:22:30.283 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 30 July 2025 11:44:27 -0400 (0:00:00.762) 0:22:31.046 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 30 July 2025 11:44:27 -0400 (0:00:00.864) 0:22:31.910 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 30 July 2025 11:44:29 -0400 (0:00:01.383) 0:22:33.294 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 30 July 2025 11:44:29 -0400 (0:00:00.686) 0:22:33.981 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:44:30 -0400 (0:00:00.547) 0:22:34.529 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:44:31 -0400 (0:00:00.847) 0:22:35.376 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:44:31 -0400 (0:00:00.400) 0:22:35.777 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:44:32 -0400 (0:00:00.612) 0:22:36.390 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:44:32 -0400 (0:00:00.380) 0:22:36.771 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:44:33 -0400 (0:00:00.419) 0:22:37.191 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:114421): WARNING **: 11:44:34.073: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.246 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 30 July 2025 11:44:34 -0400 (0:00:01.223) 0:22:38.414 ******** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 30 July 2025 11:44:35 -0400 (0:00:00.835) 0:22:39.250 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 30 July 2025 11:44:36 -0400 (0:00:00.804) 0:22:40.054 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 30 July 2025 11:44:36 -0400 (0:00:00.309) 0:22:40.363 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 30 July 2025 11:44:36 -0400 (0:00:00.309) 0:22:40.672 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 30 July 2025 11:44:36 -0400 (0:00:00.279) 0:22:40.952 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 30 July 2025 11:44:37 -0400 (0:00:00.310) 0:22:41.263 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 30 July 2025 11:44:37 -0400 (0:00:00.318) 0:22:41.581 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional 
result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:44:37 -0400 (0:00:00.354) 0:22:41.936 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:44:38 -0400 (0:00:00.316) 0:22:42.252 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:44:38 -0400 (0:00:00.270) 0:22:42.523 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:44:38 -0400 (0:00:00.303) 0:22:42.826 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:44:39 -0400 (0:00:00.319) 0:22:43.146 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:44:39 -0400 (0:00:00.314) 0:22:43.460 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:44:40 -0400 (0:00:00.644) 0:22:44.104 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 
'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 30 July 2025 11:44:41 -0400 (0:00:01.890) 0:22:45.997 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 30 July 2025 11:44:42 -0400 (0:00:00.390) 0:22:46.387 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 30 July 2025 11:44:42 -0400 (0:00:00.398) 0:22:46.785 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 30 July 2025 11:44:43 -0400 (0:00:00.384) 0:22:47.169 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 30 July 2025 11:44:43 -0400 (0:00:00.374) 0:22:47.543 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 30 July 2025 11:44:43 -0400 (0:00:00.387) 0:22:47.931 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 
Wednesday 30 July 2025 11:44:44 -0400 (0:00:00.506) 0:22:48.438 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:44:44 -0400 (0:00:00.421) 0:22:48.859 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:44:45 -0400 (0:00:00.864) 0:22:49.723 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 30 July 2025 11:44:46 -0400 (0:00:00.567) 0:22:50.291 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 30 July 2025 11:44:46 -0400 (0:00:00.266) 0:22:50.557 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 30 July 2025 11:44:46 -0400 (0:00:00.253) 0:22:50.811 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] 
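All of the thin-pool checks above are skipped because this volume was created with thin: false. For contrast, a pool description that would exercise those branches might look like the sketch below; the thin_* key names come from the volume record printed earlier, while the pool name tpool1 and the sizes are made up for illustration:

storage_pools:
  - name: foo
    type: lvm
    disks:
      - sda
    volumes:
      - name: test1
        thin: true
        thin_pool_name: tpool1
        thin_pool_size: 8g
        size: 4g
        fs_type: xfs
        mount_point: /opt/test1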
********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 30 July 2025 11:44:47 -0400 (0:00:00.320) 0:22:51.131 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:44:47 -0400 (0:00:00.305) 0:22:51.437 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:44:48 -0400 (0:00:00.851) 0:22:52.289 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 30 July 2025 11:44:48 -0400 (0:00:00.650) 0:22:52.939 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 30 July 2025 11:44:49 -0400 (0:00:00.359) 0:22:53.299 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 30 July 2025 11:44:49 -0400 (0:00:00.654) 0:22:53.954 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 30 July 2025 11:44:50 -0400 (0:00:00.880) 0:22:54.834 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 30 July 2025 11:44:51 -0400 (0:00:00.790) 0:22:55.625 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] 
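The crypttab validation above reduces to a count comparison: _storage_test_crypttab_entries (the entries parsed for this member, here an empty list) against _storage_test_expected_crypttab_entries (here the string "0"). A rough restatement of the passing assertion, not the literal task from verify-pool-member-crypttab.yml:

- name: Check for /etc/crypttab entry (rough restatement)
  ansible.builtin.assert:
    that:
      # [] | length == "0" | int  ->  0 == 0, so the check passes
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int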
********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 30 July 2025 11:44:52 -0400 (0:00:00.616) 0:22:56.241 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 30 July 2025 11:44:52 -0400 (0:00:00.607) 0:22:56.849 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 30 July 2025 11:44:53 -0400 (0:00:00.717) 0:22:57.566 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 30 July 2025 11:44:53 -0400 (0:00:00.352) 0:22:57.919 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 30 July 2025 11:44:54 -0400 (0:00:00.401) 0:22:58.320 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 30 July 2025 11:44:55 -0400 (0:00:00.930) 0:22:59.250 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': 
'/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 30 July 2025 11:44:55 -0400 (0:00:00.630) 0:22:59.881 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 30 July 2025 11:44:56 -0400 (0:00:00.340) 0:23:00.221 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 30 July 2025 11:44:56 -0400 (0:00:00.345) 0:23:00.567 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 30 July 2025 11:44:56 -0400 (0:00:00.335) 0:23:00.902 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 30 July 2025 11:44:57 -0400 (0:00:00.246) 0:23:01.148 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 30 July 2025 11:44:57 -0400 (0:00:00.267) 0:23:01.416 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 30 July 2025 11:44:57 -0400 (0:00:00.290) 0:23:01.707 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 30 July 2025 11:44:58 -0400 (0:00:00.410) 0:23:02.117 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 30 July 2025 11:44:58 -0400 (0:00:00.867) 0:23:02.984 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 30 July 2025 11:44:59 -0400 (0:00:00.314) 0:23:03.299 ******** skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 30 July 2025 11:44:59 -0400 (0:00:00.283) 0:23:03.582 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 30 July 2025 11:44:59 -0400 (0:00:00.329) 0:23:03.937 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 30 July 2025 11:45:00 -0400 (0:00:00.325) 0:23:04.262 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 30 July 2025 11:45:00 -0400 (0:00:00.311) 0:23:04.573 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 30 July 2025 11:45:00 -0400 (0:00:00.261) 0:23:04.834 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 30 July 2025 11:45:01 -0400 (0:00:00.347) 0:23:05.182 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null,
"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 30 July 2025 11:45:01 -0400 (0:00:00.258) 0:23:05.440 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:45:01 -0400 (0:00:00.531) 0:23:05.971 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:45:02 -0400 (0:00:00.762) 0:23:06.734 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:45:04 -0400 (0:00:02.016) 0:23:08.750 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:45:05 -0400 (0:00:00.441) 0:23:09.191 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:45:05 -0400 (0:00:00.687) 0:23:09.878 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:45:06 -0400 (0:00:00.785) 0:23:10.663 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:45:07 -0400 (0:00:00.410) 0:23:11.074 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:45:07 -0400 (0:00:00.682) 0:23:11.757 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:45:08 -0400 (0:00:00.701) 0:23:12.458 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 
July 2025 11:45:09 -0400 (0:00:00.813) 0:23:13.272 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:45:09 -0400 (0:00:00.278) 0:23:13.551 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:45:09 -0400 (0:00:00.277) 0:23:13.828 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:45:10 -0400 (0:00:00.268) 0:23:14.096 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:45:10 -0400 (0:00:00.255) 0:23:14.352 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:45:11 -0400 (0:00:01.054) 0:23:15.406 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:45:12 -0400 (0:00:00.717) 0:23:16.124 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:45:12 -0400 (0:00:00.668) 0:23:16.792 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task 
path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:45:13 -0400 (0:00:00.539) 0:23:17.332 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:45:14 -0400 (0:00:00.742) 0:23:18.074 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:45:14 -0400 (0:00:00.305) 0:23:18.379 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:45:15 -0400 (0:00:00.791) 0:23:19.171 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:45:15 -0400 (0:00:00.743) 0:23:19.915 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890236.5465946, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753890236.5465946, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2034, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753890236.5465946, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:45:17 -0400 (0:00:01.307) 0:23:21.222 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:45:17 -0400 (0:00:00.311) 0:23:21.533 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] 
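The stat block just above ('exists': true, 'isblk': true, mode 0660, group 'disk') is the raw material for the two device-node assertions that accompany it. A minimal sketch of an equivalent check, assuming ansible.builtin.stat with symlink following so the /dev/mapper name resolves to the underlying dm-0 block node (the register name here is made up):

- name: See whether the device node is present (sketch)
  ansible.builtin.stat:
    path: /dev/mapper/foo-test1
    follow: true  # resolve the device-mapper symlink before testing isblk
  register: storage_test_dev_stat

- name: Verify the presence of the device node (sketch)
  ansible.builtin.assert:
    that:
      - storage_test_dev_stat.stat.exists
      - storage_test_dev_stat.stat.isblk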
********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:45:19 -0400 (0:00:01.562) 0:23:23.096 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:45:19 -0400 (0:00:00.351) 0:23:23.448 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:45:19 -0400 (0:00:00.347) 0:23:23.795 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:45:20 -0400 (0:00:00.277) 0:23:24.073 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:45:20 -0400 (0:00:00.362) 0:23:24.436 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:45:20 -0400 (0:00:00.329) 0:23:24.765 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:45:23 -0400 (0:00:02.551) 0:23:27.317 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:45:23 -0400 (0:00:00.247) 0:23:27.564 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:45:23 -0400 (0:00:00.244) 0:23:27.809 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS 
volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:45:24 -0400 (0:00:00.763) 0:23:28.573 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:45:24 -0400 (0:00:00.298) 0:23:28.872 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:45:25 -0400 (0:00:00.288) 0:23:29.160 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:45:25 -0400 (0:00:00.382) 0:23:29.543 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:45:25 -0400 (0:00:00.310) 0:23:29.853 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:45:26 -0400 (0:00:00.321) 0:23:30.174 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:45:26 -0400 (0:00:00.772) 0:23:30.947 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:45:27 -0400 (0:00:00.774) 0:23:31.721 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] 
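[Note] Because the volume is not encrypted at this point in the test, the expected crypttab entry count is "0" and all of the entry-level validations above are skipped. A minimal sketch of the counting check, assuming the fact names shown in the log (a reconstruction for illustration, not the role's actual source):

    - name: Check for /etc/crypttab entry (reconstruction)
      ansible.builtin.assert:
        that:
          # _storage_test_crypttab_entries holds the matching crypttab lines ([] here)
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
        msg: unexpected number of /etc/crypttab entries for this volume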
********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:45:28 -0400 (0:00:00.800) 0:23:32.522 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:45:29 -0400 (0:00:00.694) 0:23:33.216 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:45:29 -0400 (0:00:00.640) 0:23:33.857 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:45:30 -0400 (0:00:00.397) 0:23:34.255 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:45:30 -0400 (0:00:00.313) 0:23:34.568 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:45:30 -0400 (0:00:00.255) 0:23:34.824 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:45:31 -0400 (0:00:00.332) 0:23:35.157 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:45:31 -0400 (0:00:00.400) 0:23:35.557 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
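[Note] Every RAID check in this block is skipped for the same reason: the volume under test is LVM, so each task carries the same type guard. The pattern, sketched with a hypothetical regex value (the condition is the one reported by each skip above):

    - name: Set chunk size regex (pattern sketch)
      ansible.builtin.set_fact:
        storage_test_md_chunk_size_re: 'chunk size : ([0-9]+)K'  # hypothetical value
      when: storage_test_volume.type == 'raid'  # false here, so the task is skipped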
**************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:45:31 -0400 (0:00:00.296) 0:23:35.854 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 11:45:32 -0400 (0:00:00.322) 0:23:36.177 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:45:32 -0400 (0:00:00.307) 0:23:36.484 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 July 2025 11:45:32 -0400 (0:00:00.305) 0:23:36.790 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:45:33 -0400 (0:00:00.345) 0:23:37.136 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:45:33 -0400 (0:00:00.351) 0:23:37.487 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:45:35 -0400 (0:00:01.586) 0:23:39.074 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:45:36 -0400 (0:00:01.461) 0:23:40.536 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:45:37 -0400 (0:00:00.668) 0:23:41.205 
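[Note] The two parses above agree: the requested "4g" and the actual LV size both resolve to 4 GiB = 4 x 1024^3 = 4294967296 bytes. The comparison that closes this block ("Assert expected size is actual size", below) is equivalent to a sketch like:

    - name: Assert expected size is actual size (equivalent sketch)
      ansible.builtin.assert:
        that:
          - storage_test_actual_size.bytes == storage_test_expected_size | int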
******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:45:37 -0400 (0:00:00.368) 0:23:41.574 ******** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:45:38 -0400 (0:00:01.412) 0:23:42.986 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:45:39 -0400 (0:00:00.632) 0:23:43.619 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:45:40 -0400 (0:00:00.595) 0:23:44.214 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:45:40 -0400 (0:00:00.664) 0:23:44.879 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:45:41 -0400 (0:00:00.478) 0:23:45.357 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:45:41 -0400 (0:00:00.442) 0:23:45.800 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:45:42 -0400 (0:00:00.758) 0:23:46.558 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:45:43 -0400 
(0:00:00.820) 0:23:47.378 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:45:44 -0400 (0:00:00.666) 0:23:48.045 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:45:44 -0400 (0:00:00.644) 0:23:48.689 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:45:45 -0400 (0:00:00.698) 0:23:49.387 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:45:46 -0400 (0:00:00.671) 0:23:50.058 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:45:46 -0400 (0:00:00.718) 0:23:50.777 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:45:47 -0400 (0:00:00.757) 0:23:51.535 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:45:48 -0400 (0:00:00.506) 0:23:52.042 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:45:48 -0400 (0:00:00.656) 0:23:52.698 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:45:49 -0400 (0:00:00.686) 0:23:53.384 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:45:49 -0400 (0:00:00.464) 0:23:53.848 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:45:50 -0400 (0:00:00.662) 0:23:54.511 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:45:51 -0400 (0:00:00.754) 0:23:55.266 ******** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:45:51 -0400 (0:00:00.364) 0:23:55.630 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:45:51 -0400 (0:00:00.329) 0:23:55.960 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:45:52 -0400 (0:00:00.649) 0:23:56.609 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.028865", "end": "2025-07-30 11:45:53.479830", "rc": 0, "start": "2025-07-30 11:45:53.450965" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:45:53 -0400 (0:00:01.087) 0:23:57.697 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: 
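[Note] The LV inspection above is a plain lvs query; the name-prefixed output (LVM2_SEGTYPE=linear) is what the segment-type assertion consumes. The same call as a standalone task, reconstructed from the cmd array in the log:

    - name: Get information about the LV (reconstruction)
      ansible.builtin.command:
        argv:
          - lvs
          - --noheadings
          - --nameprefixes
          - --units=b
          - --nosuffix
          - --unquoted
          - -o
          - name,attr,cache_total_blocks,chunk_size,segtype
          - foo/test1
      register: storage_test_lv
      changed_when: false  # read-only query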
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:45:54 -0400 (0:00:00.672) 0:23:58.369 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:45:55 -0400 (0:00:00.718) 0:23:59.088 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:45:55 -0400 (0:00:00.498) 0:23:59.587 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:45:56 -0400 (0:00:00.520) 0:24:00.108 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:45:56 -0400 (0:00:00.455) 0:24:00.563 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:45:56 -0400 (0:00:00.440) 0:24:01.004 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:45:57 -0400 (0:00:00.279) 0:24:01.284 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:45:57 -0400 (0:00:00.669) 0:24:01.954 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 30 July 2025 11:45:58 -0400 (0:00:00.274) 0:24:02.229 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", 
"secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:471 Wednesday 30 July 2025 11:45:59 -0400 (0:00:01.132) 0:24:03.361 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 30 July 2025 11:46:00 -0400 (0:00:00.728) 0:24:04.090 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 30 July 2025 11:46:00 -0400 (0:00:00.668) 0:24:04.758 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:46:01 -0400 (0:00:00.501) 0:24:05.259 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:46:01 -0400 (0:00:00.440) 0:24:05.700 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:46:02 -0400 (0:00:00.650) 0:24:06.350 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", 
"libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:46:03 -0400 (0:00:00.696) 0:24:07.046 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:46:03 -0400 (0:00:00.410) 0:24:07.456 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:46:03 -0400 (0:00:00.328) 0:24:07.785 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:46:04 -0400 (0:00:00.270) 0:24:08.056 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:46:04 -0400 (0:00:00.318) 0:24:08.374 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:46:05 -0400 (0:00:00.726) 0:24:09.101 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:46:07 -0400 (0:00:02.669) 0:24:11.771 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:46:08 -0400 (0:00:00.709) 0:24:12.481 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:46:09 -0400 (0:00:00.681) 0:24:13.163 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:46:13 -0400 (0:00:04.163) 0:24:17.326 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:46:13 -0400 (0:00:00.397) 0:24:17.724 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:46:14 -0400 (0:00:00.623) 0:24:18.347 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:46:14 -0400 (0:00:00.619) 0:24:18.967 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:46:15 -0400 (0:00:00.634) 0:24:19.601 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:46:18 -0400 (0:00:02.571) 0:24:22.173 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service": { "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": 
"systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:46:21 -0400 (0:00:02.929) 0:24:25.102 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:46:21 -0400 (0:00:00.854) 0:24:25.957 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8d0472a4\x2de7da\x2d4e74\x2dad6e\x2d553b7dfa8ada.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target \"dev-mapper-foo\\\\x2dtest1.device\" \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.target\" cryptsetup.target umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": 
"0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2025-07-30 11:40:54 EDT", "StateChangeTimestampMonotonic": "2961321992", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:46:23 -0400 (0:00:01.561) 0:24:27.519 ******** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 30 July 2025 11:46:26 -0400 (0:00:02.533) 0:24:30.052 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 
'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:46:26 -0400 (0:00:00.509) 0:24:30.561 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8d0472a4\x2de7da\x2d4e74\x2dad6e\x2d553b7dfa8ada.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "name": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", 
"InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454152192", "LimitMEMLOCKSoft": "454152192", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13688", "LimitNPROCSoft": "13688", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13688", "LimitSIGPENDINGSoft": "13688", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8d0472a4\\x2de7da\\x2d4e74\\x2dad6e\\x2d553b7dfa8ada.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d8d0472a4\\\\x2de7da\\\\x2d4e74\\\\x2dad6e\\\\x2d553b7dfa8ada.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", 
"StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21900", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 30 July 2025 11:46:28 -0400 (0:00:01.682) 0:24:32.244 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 30 July 2025 11:46:28 -0400 (0:00:00.353) 0:24:32.598 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 30 July 2025 11:46:29 -0400 (0:00:00.500) 0:24:33.099 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 30 July 2025 11:46:29 -0400 (0:00:00.417) 0:24:33.516 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890359.0878694, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753890359.0878694, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1753890359.0878694, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "655934768", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 30 July 2025 11:46:30 -0400 (0:00:01.195) 0:24:34.712 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:494 Wednesday 30 July 2025 11:46:31 -0400 (0:00:00.467) 0:24:35.179 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:46:32 -0400 (0:00:01.095) 0:24:36.274 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:46:32 -0400 (0:00:00.538) 0:24:36.813 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:46:33 -0400 (0:00:00.731) 0:24:37.555 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:46:34 -0400 (0:00:00.747) 0:24:38.302 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 
July 2025 11:46:34 -0400 (0:00:00.339) 0:24:38.642 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:46:34 -0400 (0:00:00.327) 0:24:38.969 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:46:35 -0400 (0:00:00.300) 0:24:39.270 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:46:35 -0400 (0:00:00.300) 0:24:39.571 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:46:36 -0400 (0:00:00.830) 0:24:40.401 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:46:38 -0400 (0:00:02.609) 0:24:43.010 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:46:39 -0400 (0:00:00.737) 0:24:43.747 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:46:40 -0400 (0:00:00.668) 0:24:44.416 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:46:43 -0400 (0:00:02.784) 0:24:47.200 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 
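
Note on the mask/unmask bracketing visible in this log: before managing pools and volumes, the role collects service facts, records any generator-created systemd-cryptsetup@… units for the affected LUKS devices in storage_cryptsetup_services, and masks them, so that systemd cannot start or stop those units in reaction to device-mapper changes while blivet reconfigures the underlying devices; it unmasks them afterwards, including on the failure path seen above. In this second run storage_cryptsetup_services is empty, so the mask step is skipped ("No items in the list"). The role's internal task is not printed in this log, so the following is only an illustrative reconstruction of what the "Mask the systemd cryptsetup services" step does, using the loop variable shown in the log:

    - name: Mask the systemd cryptsetup services   # reconstruction of the role-internal task, not copied from its source
      ansible.builtin.systemd:
        name: "{{ item }}"
        masked: true
      loop: "{{ storage_cryptsetup_services | d([]) }}"
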
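Note on the earlier failure and this retry: the fatal error above ("cannot remove existing formatting on device 'test1' in safe mode due to adding encryption") is the expected outcome of the previous invocation — its recorded module args include 'safe_mode': True, and in safe mode blivet refuses to destroy the existing xfs formatting that encrypting test1 would require. The test asserts that failure, verifies /opt/test1/quux was preserved, and then re-runs the role; the destructive actions at the end of this section (destroy format xfs, create format luks) show that the retry permits the removal, presumably via the role's storage_safe_mode variable (default true). A minimal sketch of such a retry, reconstructed from the storage_pools value printed above — the play wrapper and variable placement are illustrative, not taken from this log:

    - name: Re-run the storage role with safe mode disabled (illustrative)
      hosts: managed-node2
      tasks:
        - name: Add encryption to the volume, permitting removal of existing formatting
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: false   # default is true; false lets blivet destroy existing formatting
            storage_pools:
              - name: foo
                type: lvm
                disks:
                  - sda
                volumes:
                  - name: test1
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    encryption: true
                    encryption_password: yabbadabbadoo   # throwaway test value copied from the log
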
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:46:43 -0400 (0:00:00.672) 0:24:47.873 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:46:44 -0400 (0:00:00.681) 0:24:48.555 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:46:45 -0400 (0:00:00.644) 0:24:49.199 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:46:45 -0400 (0:00:00.581) 0:24:49.780 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:46:48 -0400 (0:00:02.559) 0:24:52.340 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:46:51 -0400 (0:00:02.962) 0:24:55.302 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:46:52 -0400 (0:00:00.744) 0:24:56.047 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:46:52 -0400 (0:00:00.246) 0:24:56.293 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create 
device", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-879bfa7c-f559-4913-abcc-112ef70b919d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:47:03 -0400 (0:00:11.005) 0:25:07.299 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:47:04 -0400 (0:00:00.752) 0:25:08.052 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890248.9147234, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", 
"checksum": "dc1e2c4ddd2abbd7db9661e74fd7df0aa5f9f598", "ctime": 1753890248.9107234, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753890248.9107234, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1416, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:47:05 -0400 (0:00:01.315) 0:25:09.367 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:47:06 -0400 (0:00:01.377) 0:25:10.745 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:47:07 -0400 (0:00:00.301) 0:25:11.046 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-879bfa7c-f559-4913-abcc-112ef70b919d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:47:07 -0400 (0:00:00.372) 0:25:11.419 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:47:07 -0400 (0:00:00.352) 0:25:11.772 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } 
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:47:08 -0400 (0:00:00.402) 0:25:12.174 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:47:09 -0400 (0:00:01.702) 0:25:13.876 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:47:11 -0400 (0:00:01.891) 0:25:15.768 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:47:13 -0400 (0:00:01.741) 0:25:17.509 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:47:14 -0400 (0:00:00.778) 0:25:18.287 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:47:16 -0400 (0:00:02.089) 0:25:20.377 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890263.1628716, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1753890254.8837855, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 318767311, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1753890254.8853202, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "111522239", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:47:17 -0400 (0:00:01.287) 0:25:21.665 ******** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-879bfa7c-f559-4913-abcc-112ef70b919d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-879bfa7c-f559-4913-abcc-112ef70b919d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:47:19 -0400 (0:00:01.744) 0:25:23.409 ******** ok: [managed-node2] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:510 Wednesday 30 July 2025 11:47:21 -0400 (0:00:02.109) 0:25:25.519 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:47:22 -0400 (0:00:00.933) 0:25:26.452 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:47:22 -0400 (0:00:00.567) 0:25:27.019 ******** skipping: [managed-node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:47:23 -0400 (0:00:00.537) 0:25:27.557 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "879bfa7c-f559-4913-abcc-112ef70b919d" }, "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "size": "4G", "type": "crypt", "uuid": "f906fdfc-28ca-4a50-a077-02d9b2d75426" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", 
"uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:47:24 -0400 (0:00:01.266) 0:25:28.823 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002985", "end": "2025-07-30 11:47:25.764941", "rc": 0, "start": "2025-07-30 11:47:25.761956" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 30 July 2025 11:47:25 -0400 (0:00:01.199) 0:25:30.023 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002982", "end": "2025-07-30 11:47:26.955344", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:47:26.952362" } STDOUT: luks-879bfa7c-f559-4913-abcc-112ef70b919d /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 30 July 2025 11:47:28 -0400 (0:00:02.522) 0:25:32.545 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 
'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 30 July 2025 11:47:29 -0400 (0:00:00.987) 0:25:33.533 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 30 July 2025 11:47:29 -0400 (0:00:00.329) 0:25:33.863 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.030343", "end": "2025-07-30 11:47:30.998928", "rc": 0, "start": "2025-07-30 11:47:30.968585" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 30 July 2025 11:47:31 -0400 (0:00:01.423) 0:25:35.286 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 30 July 2025 11:47:31 -0400 (0:00:00.499) 0:25:35.786 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 => (item=members) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 30 July 2025 11:47:32 -0400 (0:00:00.749) 0:25:36.535 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, 
"changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 30 July 2025 11:47:33 -0400 (0:00:00.929) 0:25:37.464 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 30 July 2025 11:47:34 -0400 (0:00:01.427) 0:25:38.891 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 30 July 2025 11:47:35 -0400 (0:00:00.729) 0:25:39.651 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 30 July 2025 11:47:36 -0400 (0:00:00.643) 0:25:40.294 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 30 July 2025 11:47:36 -0400 (0:00:00.732) 0:25:41.027 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 30 July 2025 11:47:37 -0400 (0:00:00.314) 0:25:41.341 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 30 July 2025 11:47:37 -0400 (0:00:00.650) 0:25:41.991 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 30 July 2025 11:47:38 -0400 (0:00:00.281) 0:25:42.273 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 30 July 2025 11:47:38 -0400 (0:00:00.429) 0:25:42.702 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:120453): WARNING **: 
11:47:39.646: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_8.7p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.246 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.246 originally 10.31.14.246 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.246 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 30 July 2025 11:47:39 -0400 (0:00:01.311) 0:25:44.013 ******** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 30 July 2025 11:47:40 -0400 (0:00:00.756) 0:25:44.769 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 30 July 2025 11:47:41 -0400 (0:00:00.658) 0:25:45.427 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 30 July 2025 11:47:41 -0400 (0:00:00.283) 0:25:45.711 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 30 July 2025 11:47:41 -0400 (0:00:00.284) 0:25:45.996 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result 
was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 30 July 2025 11:47:42 -0400 (0:00:00.301) 0:25:46.297 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 30 July 2025 11:47:42 -0400 (0:00:00.332) 0:25:46.630 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 30 July 2025 11:47:42 -0400 (0:00:00.262) 0:25:46.893 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 30 July 2025 11:47:43 -0400 (0:00:00.322) 0:25:47.216 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 30 July 2025 11:47:43 -0400 (0:00:00.399) 0:25:47.615 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 30 July 2025 11:47:43 -0400 (0:00:00.295) 0:25:47.910 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 30 July 2025 11:47:44 -0400 (0:00:00.276) 0:25:48.186 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 30 July 2025 11:47:44 -0400 (0:00:00.324) 0:25:48.510 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 30 July 2025 11:47:44 -0400 (0:00:00.295) 0:25:48.806 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 30 July 2025 11:47:45 -0400 (0:00:00.758) 0:25:49.564 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 30 July 2025 11:47:46 -0400 (0:00:00.699) 0:25:50.263 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 30 July 2025 11:47:46 -0400 (0:00:00.388) 0:25:50.652 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 30 July 2025 11:47:47 -0400 (0:00:00.383) 0:25:51.036 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 30 July 2025 11:47:47 -0400 (0:00:00.369) 0:25:51.405 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 30 July 2025 11:47:47 -0400 (0:00:00.366) 0:25:51.772 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 30 July 2025 11:47:48 -0400 (0:00:00.321) 0:25:52.094 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 30 July 2025 11:47:48 -0400 (0:00:00.311) 0:25:52.406 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 30 July 2025 11:47:48 -0400 (0:00:00.338) 0:25:52.744 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 30 July 2025 11:47:49 -0400 (0:00:00.590) 0:25:53.335 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', 
'_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 30 July 2025 11:47:49 -0400 (0:00:00.573) 0:25:53.908 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 30 July 2025 11:47:50 -0400 (0:00:00.290) 0:25:54.199 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 30 July 2025 11:47:50 -0400 (0:00:00.302) 0:25:54.501 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 30 July 2025 11:47:50 -0400 (0:00:00.288) 0:25:54.790 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 30 July 2025 11:47:51 -0400 (0:00:00.296) 0:25:55.088 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 30 July 2025 11:47:51 -0400 (0:00:00.781) 0:25:55.870 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 30 July 2025 11:47:52 -0400 (0:00:00.679) 0:25:56.549 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 30 July 2025 11:47:52 -0400 (0:00:00.300) 0:25:56.850 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 30 July 2025 11:47:53 -0400 (0:00:00.630) 0:25:57.480 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 30 July 2025 11:47:54 -0400 (0:00:00.766) 0:25:58.247 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 30 July 2025 11:47:54 -0400 (0:00:00.652) 0:25:58.900 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 30 July 2025 11:47:55 -0400 (0:00:00.653) 0:25:59.554 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 30 July 2025 11:47:56 -0400 (0:00:00.781) 0:26:00.335 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 30 July 2025 11:47:56 -0400 (0:00:00.562) 0:26:00.897 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 30 July 2025 11:47:57 -0400 (0:00:00.338) 0:26:01.235 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 30 July 2025 11:47:57 -0400 (0:00:00.290) 0:26:01.525 ******** included: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 30 July 2025 11:47:58 -0400 (0:00:00.781) 0:26:02.307 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 30 July 2025 11:47:59 -0400 (0:00:00.740) 0:26:03.047 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 30 July 2025 11:47:59 -0400 (0:00:00.397) 0:26:03.445 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 30 July 2025 11:47:59 -0400 (0:00:00.315) 0:26:03.760 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 30 July 2025 11:47:59 -0400 (0:00:00.261) 0:26:04.022 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 30 July 2025 11:48:00 -0400 (0:00:00.398) 0:26:04.420 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 30 July 2025 11:48:00 -0400 (0:00:00.276) 0:26:04.696 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 30 July 2025 11:48:00 -0400 (0:00:00.263) 0:26:04.960 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 30 July 2025 11:48:01 -0400 (0:00:00.284) 0:26:05.244 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 30 July 2025 11:48:02 -0400 (0:00:00.964) 0:26:06.208 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 30 July 2025 11:48:02 -0400 (0:00:00.320) 0:26:06.528 ******** skipping: [managed-node2] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 30 July 2025 11:48:02 -0400 (0:00:00.334) 0:26:06.863 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 30 July 2025 11:48:03 -0400 (0:00:00.302) 0:26:07.171 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } 
TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 30 July 2025 11:48:03 -0400 (0:00:00.272) 0:26:07.444 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 30 July 2025 11:48:03 -0400 (0:00:00.288) 0:26:07.732 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 30 July 2025 11:48:04 -0400 (0:00:00.307) 0:26:08.039 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 30 July 2025 11:48:05 -0400 (0:00:01.677) 0:26:09.717 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 30 July 2025 11:48:06 -0400 (0:00:00.335) 0:26:10.052 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 
30 July 2025 11:48:06 -0400 (0:00:00.534) 0:26:10.586 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:48:07 -0400 (0:00:00.616) 0:26:11.202 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:48:09 -0400 (0:00:02.003) 0:26:13.206 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:48:09 -0400 (0:00:00.302) 0:26:13.509 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:48:10 -0400 (0:00:00.755) 0:26:14.265 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }
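The eight files included above all hang off the _storage_volume_tests list set in the first task: each list item names one check subset, and the include resolves it to test-verify-volume-<subset>.yml. The dispatch is a plain include_tasks loop, roughly (the loop_var name follows the task title; the file-name convention is visible in the include lines above):

- name: Run test verify for storage_test_volume_subset
  ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset

TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025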
11:48:11 -0400 (0:00:00.841) 0:26:15.107 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:48:11 -0400 (0:00:00.389) 0:26:15.497 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:48:12 -0400 (0:00:00.753) 0:26:16.250 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:48:12 -0400 (0:00:00.757) 0:26:17.008 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:48:13 -0400 (0:00:00.820) 0:26:17.828 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:48:14 -0400 (0:00:00.262) 0:26:18.091 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:48:14 -0400 (0:00:00.279) 0:26:18.370 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:48:14 -0400 (0:00:00.304) 0:26:18.675 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:48:14 -0400 (0:00:00.350) 0:26:19.025 
******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:48:16 -0400 (0:00:01.182) 0:26:20.208 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:48:16 -0400 (0:00:00.741) 0:26:20.949 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:48:17 -0400 (0:00:00.748) 0:26:21.698 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:48:18 -0400 (0:00:00.625) 0:26:22.323 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:48:18 -0400 (0:00:00.612) 0:26:22.936 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:48:19 -0400 (0:00:00.330) 0:26:23.266 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:48:20 -0400 (0:00:00.898) 0:26:24.165 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
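The fstab subset works by building match lists out of /etc/fstab and asserting that each has exactly the expected number of hits: here, one line keyed by the LUKS mapper device, one carrying the mount point, one carrying the mount options. A rough sketch of that build-and-assert shape (reading the file with cat and filtering with select are illustrative stand-ins for the test's actual extraction; the variable names mirror the log):

- name: Read /etc/fstab
  ansible.builtin.command: cat /etc/fstab
  register: storage_test_fstab
  changed_when: false

- name: Collect the fstab lines that reference the volume
  ansible.builtin.set_fact:
    # storage_test_device_path was set earlier to the LUKS mapper path
    storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', '^' ~ storage_test_device_path ~ ' ') | list }}"

- name: Verify that the device identifier appears in /etc/fstab
  ansible.builtin.assert:
    that: storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int

TASK [See whether the device node is present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:48:20 -0400 (0:00:00.856)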
0:26:25.021 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890422.7255316, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753890422.7255316, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2034, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753890422.7255316, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:48:22 -0400 (0:00:01.213) 0:26:26.235 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:48:22 -0400 (0:00:00.302) 0:26:26.538 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:48:22 -0400 (0:00:00.239) 0:26:26.778 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:48:23 -0400 (0:00:00.417) 0:26:27.195 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:48:23 -0400 (0:00:00.376) 0:26:27.571 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:48:23 -0400 (0:00:00.272) 0:26:27.843 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:48:24 -0400 (0:00:00.335) 0:26:28.179 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890422.969534, "attr_flags": "", "attributes": [], 
"block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753890422.969534, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2110, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1753890422.969534, "nlink": 1, "path": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:48:25 -0400 (0:00:01.238) 0:26:29.418 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:48:27 -0400 (0:00:02.490) 0:26:31.908 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007901", "end": "2025-07-30 11:48:28.866728", "rc": 0, "start": "2025-07-30 11:48:28.858827" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 879bfa7c-f559-4913-abcc-112ef70b919d Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 679128 Threads: 2 Salt: 9e a8 41 c6 d2 38 f2 1d 8b 6a 9b fa 35 8a 2e 3e ba 9c 0b 82 7e 93 9f ca 74 51 14 ff f4 74 81 0f AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 129774 Salt: 49 a1 1a 7d dc 71 dd 85 38 32 71 62 96 4e e1 26 e9 f6 9d 21 c5 83 9d f2 b1 16 0f 7c ac 25 3d 22 Digest: 31 62 aa e1 b5 9b d9 77 00 59 a9 9f 4e c3 32 42 3e 12 75 a4 9e 42 3d d5 81 1e 3c a4 40 bb 9c c9 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:48:29 -0400 (0:00:01.224) 0:26:33.133 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:48:29 -0400 (0:00:00.679) 0:26:33.812 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:48:30 -0400 (0:00:00.818) 0:26:34.631 ******** ok: [managed-node2] => { 
"changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:48:30 -0400 (0:00:00.362) 0:26:34.993 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:48:31 -0400 (0:00:00.396) 0:26:35.390 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:48:31 -0400 (0:00:00.424) 0:26:35.815 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:48:32 -0400 (0:00:00.390) 0:26:36.205 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:48:32 -0400 (0:00:00.377) 0:26:36.583 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-879bfa7c-f559-4913-abcc-112ef70b919d /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:48:33 -0400 (0:00:00.868) 0:26:37.451 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:48:34 -0400 (0:00:00.691) 0:26:38.143 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:48:34 -0400 (0:00:00.791) 0:26:38.934 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:48:35 -0400 (0:00:00.795) 0:26:39.730 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:48:36 -0400 (0:00:00.752) 0:26:40.482 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:48:36 -0400 (0:00:00.301) 0:26:40.784 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:48:37 -0400 (0:00:00.252) 0:26:41.037 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:48:37 -0400 (0:00:00.263) 0:26:41.300 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:48:37 -0400 (0:00:00.293) 0:26:41.593 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:48:37 -0400 (0:00:00.311) 0:26:41.905 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:48:38 -0400 (0:00:00.279) 0:26:42.184 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 
11:48:38 -0400 (0:00:00.295) 0:26:42.480 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:48:38 -0400 (0:00:00.310) 0:26:42.791 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 July 2025 11:48:39 -0400 (0:00:00.303) 0:26:43.094 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:48:39 -0400 (0:00:00.280) 0:26:43.375 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:48:39 -0400 (0:00:00.288) 0:26:43.664 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:48:41 -0400 (0:00:01.572) 0:26:45.236 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:48:42 -0400 (0:00:01.714) 0:26:46.950 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:48:43 -0400 (0:00:00.672) 0:26:47.623 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:48:43 -0400 (0:00:00.334) 0:26:47.958 ******** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }
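Both parse steps above land on 4294967296 because the requested "4g" and the LV's actual size are normalized to bytes before being compared; the final assertion is then a plain integer equality. A minimal sketch of that normalization with the stock human_to_bytes filter (whether the test uses this filter or its own size-parsing helper is an assumption here; the fact names echo the log):

- name: Establish base value for expected size
  ansible.builtin.set_fact:
    # '4g' normalizes to 4294967296 bytes
    storage_test_expected_size: "{{ storage_test_volume.size | human_to_bytes }}"

- name: Assert expected size is actual size
  ansible.builtin.assert:
    that: storage_test_actual_size.bytes | int == storage_test_expected_size | int

TASK [Show test pool] ********************************************************** task path: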
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:48:45 -0400 (0:00:01.470) 0:26:49.428 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:48:46 -0400 (0:00:00.655) 0:26:50.084 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:48:46 -0400 (0:00:00.638) 0:26:50.722 ******** skipping: [managed-node2] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:48:47 -0400 (0:00:00.767) 0:26:51.504 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:48:48 -0400 (0:00:00.833) 0:26:52.338 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:48:48 -0400 (0:00:00.475) 0:26:52.813 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:48:49 -0400 (0:00:00.640) 0:26:53.453 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:48:50 -0400 (0:00:00.757) 0:26:54.210 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:48:50 -0400 (0:00:00.612) 0:26:54.823 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", 
"skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:48:51 -0400 (0:00:00.761) 0:26:55.584 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:48:52 -0400 (0:00:00.710) 0:26:56.295 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:48:52 -0400 (0:00:00.562) 0:26:56.858 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:48:53 -0400 (0:00:00.761) 0:26:57.619 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:48:54 -0400 (0:00:00.723) 0:26:58.343 ******** skipping: [managed-node2] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:48:54 -0400 (0:00:00.675) 0:26:59.018 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:48:55 -0400 (0:00:00.763) 0:26:59.782 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:48:56 -0400 (0:00:00.659) 0:27:00.441 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:48:56 -0400 (0:00:00.576) 0:27:01.018 
******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:48:57 -0400 (0:00:00.640) 0:27:01.659 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:48:58 -0400 (0:00:00.669) 0:27:02.329 ******** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:48:58 -0400 (0:00:00.317) 0:27:02.646 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:48:58 -0400 (0:00:00.297) 0:27:02.944 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:48:59 -0400 (0:00:00.895) 0:27:03.839 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.032078", "end": "2025-07-30 11:49:00.867902", "rc": 0, "start": "2025-07-30 11:49:00.835824" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:49:01 -0400 (0:00:01.266) 0:27:05.106 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:49:01 -0400 (0:00:00.680) 0:27:05.786 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
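The cache subset drives everything off that single lvs call: the LVM2_SEGTYPE name/value pair is pulled out of stdout and compared with what the volume spec implies, cache for a cached LV, linear otherwise. A rough sketch of the extract-and-assert step (lvs_info stands for the register of the lvs command shown above; the regex and ternary are illustrative, not the role's exact expressions):

- name: Set LV segment type
  ansible.builtin.set_fact:
    # regex_search with a capture group returns a list, e.g. ['linear']
    storage_test_lv_segtype: "{{ lvs_info.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"

- name: Check segment type
  ansible.builtin.assert:
    that: "(storage_test_volume.cached | ternary('cache', 'linear')) in storage_test_lv_segtype"

TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:49:02 -0400 (0:00:00.786) 0:27:06.573 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool",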
"skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:49:03 -0400 (0:00:00.646) 0:27:07.220 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:49:03 -0400 (0:00:00.594) 0:27:07.814 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:49:04 -0400 (0:00:00.570) 0:27:08.385 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:49:04 -0400 (0:00:00.470) 0:27:08.857 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:49:05 -0400 (0:00:00.252) 0:27:09.109 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:49:05 -0400 (0:00:00.507) 0:27:09.616 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:513 Wednesday 30 July 2025 11:49:05 -0400 (0:00:00.340) 0:27:09.956 ******** included: fedora.linux_system_roles.storage for managed-node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 30 July 2025 11:49:07 -0400 (0:00:01.335) 0:27:11.292 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 30 July 2025 11:49:07 -0400 (0:00:00.468) 0:27:11.760 ******** skipping: [managed-node2] => { "changed": false, 
"false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 30 July 2025 11:49:11 -0400 (0:00:03.543) 0:27:15.303 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 30 July 2025 11:49:12 -0400 (0:00:00.756) 0:27:16.060 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 30 July 2025 11:49:12 -0400 (0:00:00.368) 0:27:16.428 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 30 July 2025 11:49:12 -0400 (0:00:00.335) 0:27:16.764 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 30 July 2025 11:49:13 -0400 (0:00:00.335) 0:27:17.104 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 30 July 2025 11:49:13 -0400 (0:00:00.299) 0:27:17.404 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 30 July 2025 11:49:14 -0400 (0:00:00.853) 0:27:18.257 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 30 July 2025 11:49:16 -0400 (0:00:02.686) 0:27:20.943 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 30 July 2025 11:49:17 -0400 (0:00:00.580) 0:27:21.524 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 30 July 2025 11:49:18 -0400 (0:00:00.719) 0:27:22.243 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 30 July 2025 11:49:20 -0400 (0:00:02.697) 0:27:24.941 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 30 July 2025 11:49:21 -0400 (0:00:00.733) 0:27:25.675 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 30 July 2025 11:49:22 -0400 (0:00:00.562) 0:27:26.238 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 30 July 2025 11:49:22 -0400 (0:00:00.589) 0:27:26.827 
******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 30 July 2025 11:49:23 -0400 (0:00:00.631) 0:27:27.459 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Wednesday 30 July 2025 11:49:26 -0400 (0:00:02.728) 0:27:30.187 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": 
"dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", 
"state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": 
"sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": 
"systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 30 July 2025 11:49:30 -0400 (0:00:04.155) 0:27:34.343 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 30 July 2025 11:49:31 -0400 (0:00:01.011) 0:27:35.354 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 30 July 2025 11:49:31 -0400 (0:00:00.255) 0:27:35.610 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-879bfa7c-f559-4913-abcc-112ef70b919d", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 30 July 2025 11:49:34 -0400 (0:00:03.201) 0:27:38.811 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 30 July 2025 11:49:35 -0400 (0:00:00.640) 0:27:39.452 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890433.2186406, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "bedfc2f674c9a4066194d2c2c339167e56b27dad", "ctime": 1753890433.2156405, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 109052114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1753890433.2156405, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1448, "uid": 0, "version": "420080612", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 30 July 2025 11:49:36 -0400 (0:00:01.344) 0:27:40.797 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 30 July 2025 11:49:38 -0400 (0:00:01.353) 0:27:42.150 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 30 July 2025 11:49:38 -0400 (0:00:00.400) 0:27:42.551 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": "xfs" }, { "action": "destroy device", 
"device": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-879bfa7c-f559-4913-abcc-112ef70b919d", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 30 July 2025 11:49:38 -0400 (0:00:00.453) 0:27:43.004 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 30 July 2025 11:49:39 -0400 (0:00:00.345) 0:27:43.350 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", 
"thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 30 July 2025 11:49:39 -0400 (0:00:00.351) 0:27:43.701 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node2] => (item={'src': '/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-879bfa7c-f559-4913-abcc-112ef70b919d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 30 July 2025 11:49:41 -0400 (0:00:01.809) 0:27:45.510 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 30 July 2025 11:49:43 -0400 (0:00:01.915) 0:27:47.426 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 30 July 2025 11:49:44 -0400 (0:00:00.815) 0:27:48.241 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 30 July 2025 11:49:44 -0400 (0:00:00.621) 0:27:48.864 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 30 July 2025 11:49:46 -0400 (0:00:01.842) 0:27:50.706 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890446.9537835, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "09d1efeb6b2aacd945afdb99b863a037fa43d5b5", "ctime": 1753890439.1747026, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 612368766, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1753890439.1763673, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, 
"uid": 0, "version": "996620721", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 30 July 2025 11:49:47 -0400 (0:00:01.183) 0:27:51.889 ******** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-879bfa7c-f559-4913-abcc-112ef70b919d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-879bfa7c-f559-4913-abcc-112ef70b919d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 30 July 2025 11:49:49 -0400 (0:00:01.684) 0:27:53.574 ******** ok: [managed-node2] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:523 Wednesday 30 July 2025 11:49:51 -0400 (0:00:02.116) 0:27:55.691 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 30 July 2025 11:49:52 -0400 (0:00:00.906) 0:27:56.598 ******** skipping: [managed-node2] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 30 July 2025 11:49:53 -0400 (0:00:00.627) 0:27:57.225 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 30 July 2025 11:49:54 -0400 (0:00:00.871) 0:27:58.096 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "837f348d-d449-40fc-836e-4b14996e5e86" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 30 July 2025 11:49:55 -0400 (0:00:01.159) 0:27:59.255 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003065", "end": "2025-07-30 11:49:56.184314", "rc": 0, "start": "2025-07-30 11:49:56.181249" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Jul 22 13:28:35 2025 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=837f348d-d449-40fc-836e-4b14996e5e86 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 30 July 2025 11:49:56 -0400 (0:00:01.205) 0:28:00.461 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003643", "end": "2025-07-30 11:49:57.427698", "failed_when_result": false, "rc": 0, "start": "2025-07-30 11:49:57.424055" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 30 July 2025 11:49:57 -0400 (0:00:01.271) 0:28:01.732 ******** skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 30 July 2025 11:49:58 -0400 (0:00:00.494) 0:28:02.227 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'lvmpv', 'mount_options': 'defaults', 'mount_point': None, 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=qDk4s8-0dDd-7m5R-lFEb-qbmQ-JAnq-zG3wwr'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 30 July 2025 11:49:59 -0400 (0:00:00.836) 0:28:03.063 ******** ok: [managed-node2] => { 
"ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 30 July 2025 11:49:59 -0400 (0:00:00.589) 0:28:03.653 ******** included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 => (item=mount) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 => (item=fstab) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 => (item=fs) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 => (item=device) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 => (item=encryption) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 => (item=md) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 => (item=size) included: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 30 July 2025 11:50:01 -0400 (0:00:01.882) 0:28:05.536 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 30 July 2025 11:50:01 -0400 (0:00:00.466) 0:28:06.003 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 30 July 2025 11:50:02 -0400 (0:00:00.843) 0:28:06.846 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 30 July 2025 11:50:03 -0400 (0:00:00.566) 0:28:07.412 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 30 July 2025 11:50:03 -0400 (0:00:00.293) 0:28:07.706 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 30 July 2025 11:50:03 -0400 (0:00:00.270) 0:28:07.976 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 30 July 2025 11:50:04 -0400 (0:00:00.263) 0:28:08.240 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 30 July 2025 11:50:04 -0400 (0:00:00.275) 0:28:08.516 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 30 July 2025 11:50:04 -0400 (0:00:00.290) 0:28:08.806 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 30 July 2025 11:50:05 -0400 (0:00:00.338) 0:28:09.144 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 30 July 2025 11:50:05 -0400 (0:00:00.320) 0:28:09.465 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 30 July 2025 11:50:05 -0400 (0:00:00.433) 0:28:09.898 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
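Because the volume is absent, every expected match count here is set to "0"; the checks that follow compare those against matches gathered from the cat /etc/fstab output read earlier. The assertions are presumably of this shape (a sketch: the variable names come from the log, the assert body is an assumption):

- name: Verify the fstab mount point
  ansible.builtin.assert:
    that:
      - storage_test_fstab_mount_point_matches | length == storage_test_fstab_expected_mount_point_matches | int
    msg: Wrong mount point in /etc/fstab for the volume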
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 30 July 2025 11:50:06 -0400 (0:00:00.904) 0:28:10.803 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 30 July 2025 11:50:07 -0400 (0:00:00.518) 0:28:11.321 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 30 July 2025 11:50:07 -0400 (0:00:00.631) 0:28:11.953 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 30 July 2025 11:50:08 -0400 (0:00:00.611) 0:28:12.565 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 30 July 2025 11:50:09 -0400 (0:00:00.773) 0:28:13.338 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 30 July 2025 11:50:09 -0400 (0:00:00.345) 0:28:13.684 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 30 July 2025 11:50:10 -0400 (0:00:00.475) 0:28:14.160 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 30 July 2025 11:50:10 -0400 (0:00:00.561) 0:28:14.721 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1753890574.4141102, "attr_flags": 
"", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1753890574.4141102, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 451, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1753890574.4141102, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 30 July 2025 11:50:11 -0400 (0:00:01.234) 0:28:15.956 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 30 July 2025 11:50:12 -0400 (0:00:00.334) 0:28:16.290 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 30 July 2025 11:50:12 -0400 (0:00:00.296) 0:28:16.586 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 30 July 2025 11:50:12 -0400 (0:00:00.296) 0:28:16.882 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 30 July 2025 11:50:13 -0400 (0:00:00.330) 0:28:17.213 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 30 July 2025 11:50:13 -0400 (0:00:00.254) 0:28:17.468 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 30 July 2025 11:50:13 -0400 (0:00:00.208) 0:28:17.676 ******** skipping: [managed-node2] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 30 July 2025 11:50:13 -0400 (0:00:00.230) 0:28:17.906 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 30 July 2025 11:50:16 -0400 (0:00:02.593) 0:28:20.500 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 30 July 2025 11:50:16 -0400 (0:00:00.301) 0:28:20.803 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 30 July 2025 11:50:18 -0400 (0:00:01.650) 0:28:22.453 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 30 July 2025 11:50:18 -0400 (0:00:00.276) 0:28:22.730 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 30 July 2025 11:50:18 -0400 (0:00:00.268) 0:28:22.999 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 30 July 2025 11:50:19 -0400 (0:00:00.273) 0:28:23.273 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 30 July 2025 11:50:19 -0400 (0:00:00.248) 0:28:23.522 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": 
"Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 30 July 2025 11:50:19 -0400 (0:00:00.248) 0:28:23.770 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 30 July 2025 11:50:19 -0400 (0:00:00.238) 0:28:24.009 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 30 July 2025 11:50:20 -0400 (0:00:00.452) 0:28:24.462 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 30 July 2025 11:50:20 -0400 (0:00:00.516) 0:28:24.978 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 30 July 2025 11:50:21 -0400 (0:00:00.586) 0:28:25.565 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 30 July 2025 11:50:22 -0400 (0:00:00.544) 0:28:26.109 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 30 July 2025 11:50:22 -0400 (0:00:00.660) 0:28:26.770 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 30 July 2025 11:50:23 -0400 (0:00:00.308) 0:28:27.079 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": 
"Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 30 July 2025 11:50:23 -0400 (0:00:00.271) 0:28:27.350 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 30 July 2025 11:50:23 -0400 (0:00:00.302) 0:28:27.653 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 30 July 2025 11:50:23 -0400 (0:00:00.365) 0:28:28.019 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 30 July 2025 11:50:24 -0400 (0:00:00.217) 0:28:28.236 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 30 July 2025 11:50:24 -0400 (0:00:00.258) 0:28:28.495 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 30 July 2025 11:50:24 -0400 (0:00:00.308) 0:28:28.803 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 30 July 2025 11:50:24 -0400 (0:00:00.228) 0:28:29.032 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 30 July 2025 11:50:25 -0400 (0:00:00.197) 0:28:29.229 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task 
path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 30 July 2025 11:50:25 -0400 (0:00:00.292) 0:28:29.521 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 30 July 2025 11:50:25 -0400 (0:00:00.235) 0:28:29.757 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 30 July 2025 11:50:26 -0400 (0:00:00.548) 0:28:30.305 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 30 July 2025 11:50:26 -0400 (0:00:00.674) 0:28:30.979 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 30 July 2025 11:50:27 -0400 (0:00:00.596) 0:28:31.575 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 30 July 2025 11:50:27 -0400 (0:00:00.418) 0:28:31.994 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 30 July 2025 11:50:28 -0400 (0:00:00.578) 0:28:32.573 ******** skipping: [managed-node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 30 July 2025 11:50:29 -0400 (0:00:00.523) 0:28:33.096 ******** skipping: [managed-node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 30 July 2025 11:50:29 -0400 (0:00:00.526) 0:28:33.623 ******** skipping: [managed-node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the expected size 
based on pool size and percentage value] ***** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 30 July 2025 11:50:30 -0400 (0:00:00.710) 0:28:34.333 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 30 July 2025 11:50:30 -0400 (0:00:00.612) 0:28:34.946 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 30 July 2025 11:50:31 -0400 (0:00:00.454) 0:28:35.401 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 30 July 2025 11:50:31 -0400 (0:00:00.282) 0:28:35.683 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 30 July 2025 11:50:31 -0400 (0:00:00.342) 0:28:36.025 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 30 July 2025 11:50:32 -0400 (0:00:00.416) 0:28:36.441 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 30 July 2025 11:50:32 -0400 (0:00:00.501) 0:28:36.943 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 30 July 2025 11:50:33 -0400 (0:00:00.305) 0:28:37.248 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 30 July 2025 11:50:33 -0400 (0:00:00.369) 0:28:37.618 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 30 July 2025 11:50:33 -0400 (0:00:00.381) 0:28:38.000 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 30 July 2025 11:50:34 -0400 (0:00:00.376) 0:28:38.376 ******** skipping: [managed-node2] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 30 July 2025 11:50:34 -0400 (0:00:00.401) 0:28:38.777 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 30 July 2025 11:50:35 -0400 (0:00:00.331) 0:28:39.109 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 30 July 2025 11:50:35 -0400 (0:00:00.292) 0:28:39.401 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 30 July 2025 11:50:35 -0400 (0:00:00.276) 0:28:39.681 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 30 July 2025 11:50:35 -0400 (0:00:00.346) 0:28:40.027 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 30 July 2025 11:50:36 -0400 (0:00:00.299) 0:28:40.327 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } }
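Note that "Show actual size" prints a skip dictionary rather than a number: for an absent volume the parse task was skipped, so the registered variable holds the skip result itself. The expected size, by contrast, is carried in bytes, and 4294967296 is simply 4 GiB. A sketch of how such a byte value can be derived from the human-readable size used elsewhere in this run ("4g"), via Ansible's built-in human_to_bytes filter:

- name: Establish base value for expected size (sketch)
  ansible.builtin.set_fact:
    # '4g' is interpreted as 4 GiB = 4 * 1024^3 bytes = 4294967296
    storage_test_expected_size: "{{ '4g' | human_to_bytes }}"

The size assertion that follows is then a straight integer comparison between this expected value and the parsed actual size, and is skipped here for the same absent-volume reason.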
"false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 30 July 2025 11:50:36 -0400 (0:00:00.362) 0:28:40.690 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 30 July 2025 11:50:36 -0400 (0:00:00.329) 0:28:41.019 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 30 July 2025 11:50:37 -0400 (0:00:00.513) 0:28:41.533 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 30 July 2025 11:50:37 -0400 (0:00:00.305) 0:28:41.838 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 30 July 2025 11:50:38 -0400 (0:00:00.264) 0:28:42.103 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 30 July 2025 11:50:38 -0400 (0:00:00.251) 0:28:42.354 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 30 July 2025 11:50:38 -0400 (0:00:00.276) 0:28:42.631 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 30 July 2025 11:50:38 -0400 (0:00:00.283) 0:28:42.914 ******** skipping: [managed-node2] => { "changed": false, 
"false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 30 July 2025 11:50:39 -0400 (0:00:00.221) 0:28:43.136 ******** skipping: [managed-node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 30 July 2025 11:50:39 -0400 (0:00:00.301) 0:28:43.437 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 30 July 2025 11:50:39 -0400 (0:00:00.273) 0:28:43.711 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } PLAY RECAP ********************************************************************* managed-node2 : ok=1245 changed=60 unreachable=0 failed=0 skipped=1073 rescued=18 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:23:08.748852+00:00Z", "host": "managed-node2", "message": "encrypted volume 'foo' missing key/password", "start_time": "2025-07-30T15:23:06.741219+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:23:09.064135+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", 
"thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:23:08.835561+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:25:25.594953+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9' in safe mode due to encryption removal", "start_time": "2025-07-30T15:25:23.253228+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:25:26.046046+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 
0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-453b1ac7-19a4-4c29-b50b-1994fb6b8ca9' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:25:25.759322+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:27:24.861335+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2025-07-30T15:27:22.525304+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:27:25.284770+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:27:24.995461+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:29:34.235783+00:00Z", "host": "managed-node2", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-07-30T15:29:31.781041+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:29:34.713573+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:29:34.427996+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:32:19.506408+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da' in safe mode due to encryption removal", "start_time": "2025-07-30T15:32:17.191899+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:32:19.932656+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-7f16eb98-ed03-46aa-bb0a-8abc8caee7da' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:32:19.616493+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:34:52.599351+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2025-07-30T15:34:50.121894+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:34:53.068909+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": 
null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:34:52.739906+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:37:40.304877+00:00Z", "host": "managed-node2", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-07-30T15:37:37.862383+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:37:40.679440+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ 
{ "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:37:40.451288+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:43:22.803954+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada' in safe mode due to encryption removal", "start_time": "2025-07-30T15:43:20.122422+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:43:23.254267+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-8d0472a4-e7da-4e74-ad6e-553b7dfa8ada' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:43:22.933688+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:46:25.862608+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2025-07-30T15:46:23.494350+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.13", "end_time": "2025-07-30T15:46:26.336095+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-07-30T15:46:26.021154+00:00Z", "task_name": "Failed message", "task_path": 
"/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Wednesday 30 July 2025 11:50:39 -0400 (0:00:00.236) 0:28:43.948 ******** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.99s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.55s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.01s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.00s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.59s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 9.26s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 5.52s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Gathering Facts --------------------------------------------------------- 5.30s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.31s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.storage : Get service facts ------------------- 4.27s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 4.26s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Get the canonical device path for each member device -------------------- 4.18s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 fedora.linux_system_roles.storage : Get required packages --------------- 4.16s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get service facts ------------------- 4.16s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 4.12s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 4.04s /tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service 
TASKS RECAP ********************************************************************
Wednesday 30 July 2025  11:50:39 -0400 (0:00:00.236)       0:28:43.948 ********
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.99s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.55s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.01s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.00s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.59s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 9.26s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 5.52s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Gathering Facts --------------------------------------------------------- 5.30s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.31s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Get service facts ------------------- 4.27s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
fedora.linux_system_roles.storage : Get service facts ------------------- 4.26s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Get the canonical device path for each member device -------------------- 4.18s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
fedora.linux_system_roles.storage : Get required packages --------------- 4.16s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get service facts ------------------- 4.16s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
fedora.linux_system_roles.storage : Get service facts ------------------- 4.12s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
fedora.linux_system_roles.storage : Get service facts ------------------- 4.04s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
fedora.linux_system_roles.storage : Get service facts ------------------- 4.00s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
fedora.linux_system_roles.storage : Get service facts ------------------- 3.81s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Write the key into the key file ----------------------------------------- 3.79s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319
fedora.linux_system_roles.storage : Update facts ------------------------ 3.72s
/tmp/collections-gHt/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
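The remaining failures in the errors block, "cannot remove existing formatting ... in safe mode due to encryption removal" and "... due to adding encryption", are likewise deliberate probes: with safe mode on (the role's default), the role refuses to destroy existing formatting when a rerun toggles encryption on an existing volume. A hedged sketch of the opt-out such a destructive change requires (storage_safe_mode is the role's documented switch; the pool and volume values again mirror the module_args above; vault_luks_password remains hypothetical):

    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.storage
      vars:
        # Default is true; leaving it true reproduces the safe-mode
        # refusals recorded in the errors block above.
        storage_safe_mode: false
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: false  # removing LUKS from the existing encrypted volume
                encryption_password: "{{ vault_luks_password }}"  # hypothetical; the module_args above show a password is still passed on removal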